
Sumo: Wii U specs are surprising; "way more memory" than PS3/360

I find it hard to believe that, by 2014/2015, games aren't going to look a hell of a lot better than anything that we've seen on current gen consoles.

One of the things I noticed in past threads about Samaritan and the Elemental demo for example is that art style affects perception. There were quite a few posts where people couldn't separate the art style from the tech and what that tech was accomplishing in those demos. I don't see art styles for mainstream games having a dramatic shift. And if we are seeing people that post here having trouble seeing it, then what about the majority that don't post on message boards? Some of these late gen PS3 games (if they are staying on PS3) aren't going to help either.

I'd actually agree to an extent, but try telling some PC guys that. To them, BF3 on console, or choose your console version here, might as well be an 8-bit Nintendo game or something. Never mind the horrors of 30 FPS and lowly 720p (and I have to agree, after watching my brother PC game a lot, I'm ready for a move to 1080p, even though 720p never bothered me before).

In other words, ironically, I'm typically on your side, arguing that the console version of game X is not THAT far behind the PC version. I'd even argue that for BF3, to an extent. I think this because PC ports are based on the console versions, though, not because of any diminishing-returns factor.

But I was speaking of Crysis 3 anyway, and again, that trailer, just the textures alone... you run that in 1080p and I think you're in the ballpark of next gen. Though I think next gen will far exceed it eventually (i.e., what Killzone 2 was to 1st-gen 360 games).

Watch Dogs and Star Wars 1313 are other good examples of sort-of next gen, running now. They impress me much, much more than, say, the UE4 demo, because they are real.

But I'm not talking about the PC guy. I'm talking about the person who most likely isn't spending the money to build a gaming PC. Just like how CliffyB talked about the "Mom test".

Wii U is rumored to have Compute Shaders (Shader Model 5), and compiling Shader Model 4 code to run on Shader Model 5 has a negative performance impact.

Could this be the reason why the Wii U seems weaker than its true potential? Poor ports and a lack of understanding of next-gen architecture?

That claim comes directly from the target specs so I think we can move it past rumor stage.

And I wouldn't call it a lack of understanding. If anything, it's a lack of time to port code to the GPU for straight ports. This is why future ports, where things are pointing to this being the direction for all the next consoles, should be easier to bring to Wii U than current ports that are most likely heavily reliant on VMX128 and SPEs on PS360.
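For anyone wondering what "porting code to the GPU" looks like in practice, here's a minimal, made-up C sketch of the kind of data-parallel loop that current engines hand-tune for VMX128/SPEs; on a compute-shader-capable (Shader Model 5 class) GPU the same per-element body becomes a compute kernel dispatched once per element. Every name and type below is invented for illustration, not taken from any dev kit.

```c
/* Illustrative only: a per-particle update written as a plain C loop.
 * On Xenon/Cell this body would typically be rewritten with SIMD
 * intrinsics or moved onto an SPE; on a DX11-level GPU it maps to a
 * compute shader with one thread per particle. */
#include <stddef.h>

typedef struct { float x, y, z; } vec3;

void integrate_particles(vec3 *pos, vec3 *vel, const vec3 *accel,
                         float dt, size_t count)
{
    for (size_t i = 0; i < count; ++i) {
        vel[i].x += accel[i].x * dt;   /* independent per element,   */
        vel[i].y += accel[i].y * dt;   /* so it parallelizes cleanly */
        vel[i].z += accel[i].z * dt;
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}
```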

Question. If the 750 line from IBM has been pretty much phased out, then aside from backwards compatibility, why go with that line? Why not build the backwards compatibility into a new chip? From what I have read and understand (though it isn't much), IBM never intended the 750 family to go smaller than 90nm. So instead of retooling, enhancing and shrinking Broadway, why not go with a new chip?

That's what makes the 476FP very likely in this case.
 

onQ123

Member
Ding Ding Ding! We have a winner. It looks like that is exactly what they did. They made a tri-core "Enhanced" Broadway. I understand it for the "Wii compatibility mode", but good lord is it a terrible idea for anything else.

Now I'm shady? It's not my fault I am not an engineer and call them like the docs say. About the CPU I am sure of this. The term "enhanced Broadway" is straight from Nintendo.


[image: Wii U CPU]



Wii U CPU is 3 Wii CPU cores clocked a bit higher: "enhanced Broadway"


I always believed
 
There's a high degree of confirmation bias on all sides, really. But it will be interesting when the Wii U thread moves out of the safe harbour of Community and into the Gaming side, where people are not all as enamoured with the system and the propensity for groupthink will be greatly diminished.

People who aren't enamored should take their posts to threads that can actually benefit from them.

If you are a gamer who simply must have the best graphical representation possible in a very spec-oriented way, buy a PC. Or wait for the next Xbox or PS and decide between the two based on their specs. Nintendo has no intention of competing in that way. This is a known fact restated 10000000 times on this forum and others. It's been their method for over half a decade. To continue to stress over this fact is insane. Their concept of what they consider good enough is exactly in line with what they are doing. There is no "fail" on their part. They are designing a specific tool for a specific job. They have made a can opener that opens cans as competently as it ever needs to based on the history of successful can opening. Just because a bunch of people in a forum are pissed it won't also toast their bread doesn't make it a shitty can opener. You don't see me in Xbox threads talking about how terrible the consoles are because they don't play Mario games. I don't see why anyone would come into a Nintendo console thread going "but but but it's not even the most powerful console out!!!!!!".

What a complete waste of their time, and others.
 

onQ123

Member
You forgot to post the bit where bgassassin says that the info from his inside sources contradicts that. Who is he anyway? Espresso? 5 posts? What? No reliable info on the WiiU CPU exists except the little bit on the Dev Kit leak - just mountains of speculation. Gettin' sick of this shit.

What post? I didn't see it. Can you post a link?
 

hodgy100

Member
I thought we'd been told that it is an out-of-order CPU, and therefore it does not have the limitations that the CPUs in the PS3 and 360 have from being in-order CPUs.

If that is true, a lower clock speed isn't necessarily a bad thing.
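Purely as an illustration of why out-of-order execution can offset a lower clock (nothing Wii U specific, every name below is invented): an in-order core like Xenon or the Cell PPE has to sit on a cache-missing load before anything behind it can run, while an out-of-order core keeps chewing through the independent work in the meantime.

```c
/* Sketch: a table walk mixed with independent checksum work.
 * In-order core: the 'sum += table[idx]' load stalls the whole pipeline
 * on a cache miss. Out-of-order core: the checksum line has no dependency
 * on the load, so it executes while the miss is still in flight. */
unsigned walk(const unsigned *table, const unsigned *keys, int n)
{
    unsigned sum = 0, checksum = 0;
    for (int i = 0; i < n; ++i) {
        unsigned idx = keys[i] & 1023;
        sum += table[idx];                  /* likely cache miss           */
        checksum ^= keys[i] * 2654435761u;  /* independent work to overlap */
    }
    return sum ^ checksum;
}
```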
 

Doc Holliday

SPOILER: Columbus finds America
No way this is true. Why in the world would Nintendo do that? If anything, wouldn't it be cheaper to use a more conventional CPU?
 

Mr Swine

Banned
No way this is true. Why in the world would Nintendo do that? If anything, wouldn't it be cheaper to use a more conventional CPU?

Not only that, the Wii CPU is so old that shrinking it to 45/32nm and slapping 3 of them into a tri-core CPU doesn't even make sense. Even if Nintendo did that, it would be incredibly slow/weak compared to the CPUs in the Xbox 360 and PS3.
 
You forgot to post the bit where bgassassin says that the info from his inside sources contradicts that. Who is he anyway? Espresso? 5 posts? What? No reliable info on the WiiU CPU exists except the little bit on the Dev Kit leak - just mountains of speculation. Gettin' sick of this shit. Edit: oh, and a couple of on-record quotes from developers.
Actually, thanks to Arkam telling us where the "enhanced Broadway" statement came from, this goes well with what Espresso said, and it forms a possible connection with the other sources. The issue, though, is that Broadway wasn't able to go much above 1GHz and can't have multiple cores due to its design, so it literally can't just be "3 Broadways stuck together and clocked a little higher." I talked to bgassassin about this earlier, and both of us agree that this info is now making it more likely that the Wii U is designed close to the 476FP. It is OoOE, multi-core enabled, not able to be clocked as high as the 360's CPU, and can be roughly described as an "enhanced Broadway."
 

wsippel

Banned
Ding Ding Ding! We have a winner. It looks like that is exactly what they did. They made a tri-core "Enhanced" Broadway. I understand it for the "Wii compatibility mode", but good lord is it a terrible idea for anything else.
Terrible idea? Dunno. The design is pretty good, and it has some interesting, unique features they'd need to keep to achieve compatibility. Depends on how they've enhanced it - Intel Core is heavily modified Pentium M architecture if I remember correctly, and we probably all agree that it's far superior to the Pentium 4 architecture even though it's really, really old. Wonder how they did the SMP, though. Some dude with a terrible grasp of the English language posted early last year that it was, in fact, a heavily modified 750CXe core, using a ringbus for SMP. That sounds pretty damn weird, as the only other ringbus design I'm aware of was Larrabee.
 

tkscz

Member
Not only that, the Wii CPU is so old that shrinking it to 45/32nm and slapping 3 of them into a tri-core CPU doesn't even make sense. Even if Nintendo did that, it would be incredibly slow/weak compared to the CPUs in the Xbox 360 and PS3.

I wouldn't say it's incredibly slower.

So far based on my tests, the Wii is definitely a lot faster than the Xbox 1. Not a little bit, but a lot.

Super Street Fighter II X Turbo Revival on VBA Next runs at around 59/58/60fps vs. 43/42fps on Xbox 1.
Virtua Racing on Genesis Plus GX (RetroArch Wii) runs at 60fps vs. 45fps or so on Xbox 1.

And SNES9x Next has a similar speed difference.

Really, the Wii has an excellent processor given the clock speed and Nintendo's modesty in advertising it (as in, not hyping up the tech specs to any degree). If any console manufacturer should be ashamed of their tech choices this generation, it should have been Sony and MS, which opted for bottlenecked-like-hell in-order CPUs that run no better than a Pentium 4 (which is bad in and of itself given the lifecycle they expected these consoles to have). It's amazing that, other than a few Anandtech articles back in 2005/2006, nobody in game development circles has ever blasted Microsoft/Sony's consoles for having such weak CPUs in the first place, given the clock speed and the marketing hype surrounding them. The entire reason you have this SPU infatuation going on on the PS3 is that the main CPU is so utterly weak that it would have no chance in hell of competing with the 360 (which is weak enough as it is, but has at least 3 'weak' main CPU cores instead of just 1 weak main one) if it were not possible to fall back on those SPUs - and even those SPUs have no purpose if you don't have a lot of heavy-duty tasks to off-load from the main CPU.

I believe if you program only against one main CPU (like we do for pretty much most emus), you would find that the PS3/Xenon CPUs in practice are only about 20% faster than the Wii CPU.

I've ported the same code over to enough platforms by now to state this with confidence - the PS3 and 360 at 3.2GHz are only (at best, I would stress) 20% faster than the 729MHz out-of-order Wii CPU without multithreading (and multithreading isn't a be-all-end-all solution, nor a 'one size fits all' magic wand). That's pretty pathetic considering the vast difference in clock speed, the increase in L1/L2 cache and everything else - even for in-order CPUs, they shouldn't be this abysmally slow and should be totally leaving the Wii in the dust by at least a 50-70% difference - but they don't.

BTW - if you search around on some of the game development forums you can hear game developers talking amongst themselves about how crap the 360/PS3 CPUs were to begin with. They were crap from the very first minute the systems were launched - with MS hardware executives (according to some 360 'making of' book) allegedly freaking out when IBM told them they would be getting in-order CPUs for their new console - which caused them to place an order for three 'cores' instead of one, because one core would be totally pathetic (pretty much like the PS3, then, where you only have one main processor and 6/7 highly specialized 'vector' SIMD CPUs that are very fast but also very low on individual RAM and essentially have to do some heavy code weightlifting for you to gain anything). Without utilizing multithreading, you're essentially looking at the equivalent of Pentium 4-spec consoles that have to be helped along by lots of vector CPUs (SPUs) and/or reasonably mid-specced, highly programmable GPUs (which the Wii admittedly lacks).
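As a rough, hypothetical sketch of why a figure like that is plausible for emulator-style code (all names below invented, not from any real emulator): the hot loop is a chain of dependent loads and hard-to-predict indirect branches, so an in-order core spends most of its cycles stalled no matter how high it is clocked. The implied arithmetic from the quote: 3200MHz / 729MHz is about a 4.4x clock advantage, yet only ~1.2x observed speedup, i.e. roughly a quarter of the per-clock throughput on this kind of code.

```c
/* Sketch of a single-threaded, branchy interpreter dispatch loop: each
 * iteration is a dependent fetch, an unpredictable switch, and a little
 * arithmetic. Memory latency and branch penalties dominate, so raw clock
 * speed buys surprisingly little on an in-order core. */
typedef struct cpu_state {
    unsigned pc;
    unsigned regs[16];
    const unsigned char *mem;
} cpu_state;

void interpret(cpu_state *st, long cycles)
{
    while (cycles-- > 0) {
        unsigned char op = st->mem[st->pc++];   /* dependent load           */
        switch (op) {                           /* hard-to-predict dispatch */
        case 0x00: st->regs[0] += st->regs[1]; break;
        case 0x01: st->regs[1] ^= st->regs[2]; break;
        case 0x02: st->pc = st->regs[3];       break;  /* taken branch      */
        default: /* hundreds more opcodes in a real core */ break;
        }
    }
}
```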
 

Suairyu

Banned
No, far from that. It was a poor port which looked and performed better than the 360 version on res alone, but carried over shortcomings from the consoles. Simple as that.

In short, Sumo is not tech-savvy enough to make a competent-looking game as it is, even with powerful PC HW. They can make games though, so don't get all bent like I'm insulting their work.
What PC version did you play?

Locked 60fps framerate, better res and graphics options.

Only thing consoles had over PC was online multiplayer (kart racers are all about the local multi anyway) and the option to pay for an overpriced DLC pack.
 

mrklaw

MrArseFace
DSPs, and custom chips in general, are a good idea for performing specific tasks. I've loved the idea of having custom chips since the 8/16-bit generations, and I particularly liked the way they were implemented on the Amiga, but it doesn't make much sense nowadays. Back in those days, CPUs were quite weak, so it made a lot of sense to have custom chips perform critical tasks, but it's not particularly cost-effective anymore. The industry has been moving away from that concept for a while now. Even modern GPUs are massively parallel processors that have moved from being fixed-function to fully programmable, and full convergence between CPUs and GPUs seems inevitable down the line.

I don't think there's any need for DSPs with modern CPUs and GPUs having enough power and being more flexible, but it's nice to have it on the Wii U, seeing as its CPU seems to be a bit on the weak side compared to the rest of the system. The problem right now is that developers, particularly those working on launch titles, might disregard it for lack of time.
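To give a sense of the per-sample grunt work an audio DSP takes off the CPU, here's a minimal, made-up C mixing loop (illustrative only, not Wii U API code, all names invented): on hardware without a DSP, something like this simply burns CPU cycles every audio frame, which is exactly the kind of housekeeping a weak-ish CPU is glad to hand off.

```c
/* Sketch: mix N interleaved-stereo voices into one stereo output buffer,
 * the sort of fixed, repetitive work a dedicated audio DSP handles. */
#include <stddef.h>
#include <stdint.h>

void mix_voices(int16_t *out_lr, const int16_t **voices, const float *gain,
                size_t num_voices, size_t num_frames)
{
    for (size_t f = 0; f < num_frames; ++f) {
        float l = 0.0f, r = 0.0f;
        for (size_t v = 0; v < num_voices; ++v) {
            l += voices[v][2 * f]     * gain[v];
            r += voices[v][2 * f + 1] * gain[v];
        }
        /* clamp to 16-bit range before writing back */
        if (l > 32767.0f) l = 32767.0f; else if (l < -32768.0f) l = -32768.0f;
        if (r > 32767.0f) r = 32767.0f; else if (r < -32768.0f) r = -32768.0f;
        out_lr[2 * f]     = (int16_t)l;
        out_lr[2 * f + 1] = (int16_t)r;
    }
}
```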




That's quite strange wording. You would be better off simply saying that the Wii U has a DX11-level GPU, whereas RSX and Xenos are DX9.0c+-level GPUs.

Fine, just don't use x86 or the AMD equivalent where you have goodness knows how much of the silicon wasted due to legacy support which simply isn't needed for a console environment.
 
That was about the CPU having 2 threads per core, & you're ignoring the fact that he said the same thing that is being said now about the CPU being an enhanced Wii CPU with more cores.

Point is there's conflict of 'source' info. Do you know 5 post Espresso personally or something? I mean, who the fuck?

Actually, thanks to Arkam telling us where the "enhanced Broadway" statement came from, this goes well with what Espresso said, and it forms a possible connection with the other sources. The issue, though, is that Broadway wasn't able to go much above 1GHz and can't have multiple cores due to its design, so it literally can't just be "3 Broadways stuck together and clocked a little higher." I talked to bgassassin about this earlier, and both of us agree that this info is now making it more likely that the Wii U is designed close to the 476FP. It is OoOE, multi-core enabled, not able to be clocked as high as the 360's CPU, and can be roughly described as an "enhanced Broadway."

Would there be any reason to describe this to developers as such? Would they somehow go 'ohhh right' and take experience from Wii development over to a possible 'enhanced Broadway'?


For the same reason that the new Wii model removed GC backwards compatibility?

Was there ever anything official said about that? If GC games get VC'ed on the Wii U, they're gonna have to map controls to the uPad.
 

onQ123

Member
Point is there's conflict of 'source' info. Do you know 5 post Espresso personally or something? I mean, who the fuck?

Why are you focused on the 1st guy's post count? The point is that it's a new source saying the same thing & this source is known to have inside info.

But you can continue to ignore that & worry about Espresso's post count if you like. Have fun with it, do a recount of his posts, check his 5-post history, send it to PostFax, it doesn't matter.
 

tkscz

Member
I don't know, I have a hard time believing that an old CPU created in early 2001 is only 20% slower than a CPU that is a few generations newer.

If the CPUs in question were horribly made, it's possible. The guy uses the Pentium 4 as an example, and for good reason: the Pentium 4 was originally made so badly that some Pentium IIIs were better CPUs.
 

beje

Banned

So, can we safely say that even if the clock speed is on par with or even lower than Xenon's, the WiiU CPU works much better due to being OoOE? In that case (just pure speculation, my CPU knowledge is rather limited), would it mean that just some (maybe most) of the FUD against the machine comes from lazy devs throwing at the Wii U code optimized for in-order execution (PS3/360), thus not taking advantage of out-of-order execution and expecting it to work flawlessly? Would that also explain the really poor console-to-PC ports this gen?
 

mrklaw

MrArseFace
Surely what you want from a console is power for the tasks you need - graphics obviously, sound, physics, AI, housekeeping etc. Seems to me a stream processor - be it a CELL SPE or a GPGPU unit - would be good for most of those. So you could not bother with a powerful CPU, and just have a big GPU and pile it all on there. But then you have less processing power for the graphics because units are working on other stuff. But likewise it doesn't seem an efficient use of your silicon budget to have an entire x86 core dedicated to sound processing - I think that's why CELL worked well.

I'd almost be tempted to have a pseudo-SLI approach: a simple CPU to handle scheduling etc., one big fat GPU for graphics, which are clearly important, and a smaller GPU customised for compute, for physics, sound processing etc. Kind of like PC users keeping an old Nvidia card installed for PhysX duties.
 
Why are you focused on the 1st guy's post count? The point is that it's a new source saying the same thing & this source is known to have inside info.

But you can continue to ignore that & worry about Espresso's post count if you like. Have fun with it, do a recount of his posts, check his 5-post history, send it to PostFax, it doesn't matter.

Yeah, sound, I'll sign up and post that the Wii U has nanobot technology trained by leprechauns so you can make a thread about it. The Easter Bunny? Sorry to break it to you.
 

tkscz

Member
So, can we safely say that even if the clock speed is on par with or even lower than Xenon's, the WiiU CPU works much better due to being OoOE? In that case (just pure speculation, my CPU knowledge is rather limited), would it mean that just some (maybe most) of the FUD against the machine comes from lazy devs throwing at the Wii U code optimized for in-order execution (PS3/360), thus not taking advantage of out-of-order execution and expecting it to work flawlessly? Would that also explain the really poor console-to-PC ports this gen?

Theoretically, yes. This has been brought up in the WiiU thread several times. If most ports from the PS360 are just straight ports, then this would be the case. Now, we don't know whether devs have actually made the code for OoOE or not.
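For the curious, a hedged example (names invented) of what "code optimized for in-order execution" tends to look like: ports tuned for Xenon/Cell hand-schedule their loops, hoisting loads so they are in flight before the results are needed. An out-of-order core does that reordering in hardware, so the naive version runs about as well on it, and the hand-scheduled version buys nothing.

```c
/* Naive version: on an in-order core each multiply waits on its own loads. */
float dot_naive(const float *a, const float *b, int n)
{
    float acc = 0.0f;
    for (int i = 0; i < n; ++i)
        acc += a[i] * b[i];
    return acc;
}

/* Hand-scheduled version: the next iteration's loads are issued early so
 * they overlap with the current multiply-add, the kind of tuning an
 * in-order target rewards and an out-of-order core makes unnecessary. */
float dot_scheduled(const float *a, const float *b, int n)
{
    float acc = 0.0f;
    if (n <= 0) return acc;
    float a0 = a[0], b0 = b[0];      /* prime the pipeline               */
    for (int i = 1; i < n; ++i) {
        float a1 = a[i], b1 = b[i];  /* start next loads early           */
        acc += a0 * b0;              /* use values loaded last iteration */
        a0 = a1; b0 = b1;
    }
    acc += a0 * b0;                  /* drain the final pair             */
    return acc;
}
```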
 

Steph_E.

Member
People who aren't enamored should take their posts to threads that can actually benefit from them.

If you are a gamer who simply must have the best graphical representation possible in a very spec-oriented way, buy a PC. Or wait for the next Xbox or PS and decide between the two based on their specs. Nintendo has no intention of competing in that way. This is a known fact restated 10000000 times on this forum and others. It's been their method for over half a decade. To continue to stress over this fact is insane. Their concept of what they consider good enough is exactly in line with what they are doing. There is no "fail" on their part. They are designing a specific tool for a specific job. They have made a can opener that opens cans as competently as it ever needs to based on the history of successful can opening. Just because a bunch of people in a forum are pissed it won't also toast their bread doesn't make it a shitty can opener. You don't see me in Xbox threads talking about how terrible the consoles are because they don't play Mario games. I don't see why anyone would come into a Nintendo console thread going "but but but it's not even the most powerful console out!!!!!!".

What a complete waste of their time, and others.

I agree with this 100%. It should be quoted in response to all trolls.
 
Was there ever anything official said about that? If GC games get VC'ed on the Wii U, they're gonna have to map controls to the uPad.

Officially said about what?

And although the Wii U controller has more buttons than the Gamecube controller, the lack of analogue triggers could be a problem for a lot of games. I reckon they'll release whole new HD versions with reworked controls - much more lucrative for practically no extra effort.
 
Officially said about what?

And although the Wii U controller has more buttons than the Gamecube controller, the lack of analogue triggers could be a problem for a lot of games. I reckon they'll release whole new HD versions with reworked controls - much more lucrative for practically no extra effort.

About why they culled BC on Wii. Was it just production costs of controller and mem card slots, and were any chips removed from the PCB?
 
Point is there's conflict of 'source' info. Do you know 5 post Espresso personally or something? I mean, who the fuck?



Would there be any reason to describe this to developers as such? Would they somehow go 'ohhh right' and take experience from Wii development over to a possible 'enhanced Broadway'?

We had been wondering where to place Espresso's statements before, but what Arkam said matches that info, and Arkam is verified by the mods. What Arkam clarified is that the statement is not from an engineering level, which was one of the original major issues.

The "enchanted Broadway" statement may only really benefit devs that did take the time to push the Gamecube/Wii, like Nintendo's first party and capcom. For others, it is more like a warning that "this CPU is nothing like xenon or cell", since they are accustomed to work with the strengths and weaknesses of those chips.
 

LCGeek

formerly sane
That would be fine if Nintendo lived in a vacuum, but there are other factors, such as third parties who aren't pleased at all with the Wii.

You mean third parties who do nothing with 3D Nintendo systems for the most part. You can literally count on two hands the third parties from the N64 till now who have even really bothered to exploit the system.

So what if they are rarely pleased? They still shove garbage ports onto the system despite their displeasure.
 

onQ123

Member
Yeah, sound, I'll sign up and post that the Wii U has nanobot technology trained by leprechauns so you can make a thread about it. The Easter Bunny? Sorry to break it to you.

How about you look back a few pages & see that the info is coming from someone other than the person with 5 posts.
 

antonz

Member
It's technologically impossible for the WiiU CPU to be Wii cores. It's that simple, unless Nintendo has worked some witchcraft and Broadway suddenly became a monster.
 
I'd almost be tempted to have a pseudo-SLI approach: a simple CPU to handle scheduling etc., one big fat GPU for graphics, which are clearly important, and a smaller GPU customised for compute, for physics, sound processing etc. Kind of like PC users keeping an old Nvidia card installed for PhysX duties.

This is actually what I had in mind around 2007, but I suppose the cost of 2 GPUs is just too much?
 

OniShiro

Banned
You mean third parties who do nothing with 3D Nintendo systems for the most part. You can literally count on two hands the third parties from the N64 till now who have even really bothered to exploit the system.

So what if they are rarely pleased? They still shove garbage ports onto the system despite their displeasure.
Wii ports are so bad because of tech limitations. If it was in the same league as the PS360, ports would have been identical across the 3 consoles, but it's not profitable to redo a whole game just for the Wii because assets and code can't be reused.
 

Sid

Member
Wii ports are so bad because of tech limitations. If it was in the same league as the PS360, ports would have been identical across the 3 consoles, but it's not profitable to redo a whole game just for the Wii because assets and code can't be reused.
It's not just specs which determine whether a system will get a port or not. If that were the case, how would you describe the 3rd-party support of the N64 and GameCube?
 
I agree with this 100%. It should be quoted in response to all trolls.

It's a complete waste of everyone's time to come into a thread about technical specifications and complain "who cares about specs?" If you don't want to discuss them, there are dozens of other threads to read.
 

OniShiro

Banned
It's not just specs which determine whether a system will get a port or not. If that were the case, how would you describe the 3rd-party support of the N64 and GameCube?
I'm not saying it's only specs, but if the specs are so different to begin with, then it's much harder to get ports instead of toned-down versions.
 

Rolf NB

Member
About why they culled BC on Wii. Was it just production costs of controller and mem card slots, and were any chips removed from the PCB?
Just the ports, and something about the disc drive being able to eat the (smaller) GC discs. The Wii CPU/GPU are completely identical to the GameCube's, with a 50% bump in clock speed. The only architectural differences are:
1) what used to be the GameCube's main memory is now a single chip merged onto the GPU package
2) the old A-RAM has been replaced with a single (bigger) GDDR3 chip
3) a separate embedded management processor running exclusively OS stuff (IOS)
 