
WiiU technical discussion (serious discussions welcome)

Thraktor

Member
If my understanding of the bitcoin mining algorithm is correct, I'm pretty sure memory use is so small to the point of being trivial, so I doubt the performance difference is due to a change in the memory hierarchy, and R800 to GCN was the bigger change in that regard anyway. Incidentally, I'm pretty sure the very small memory footprint is one of the reasons it runs so well on GPUs.

Anyway, if I were to guess, then I'd say R800 introduced some new integer or bitwise operations that would help bitcoin mining considerably. These aren't the kind of operations that would be used that often in gaming compute tasks, but I suppose it goes to show how wide a class of things "GPGPU" can be, and how different aspects of the GPU affect different compute loads very, well... differently.
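(For anyone curious why the footprint is so tiny: a Bitcoin mining attempt is just a double SHA-256 over an 80-byte block header with a varying nonce, so the working set per attempt is well under a kilobyte. A minimal Python sketch of one attempt, purely illustrative, not a real miner:)

```python
import hashlib
import struct

def mine_attempt(header_76: bytes, nonce: int) -> bytes:
    """One Bitcoin mining attempt: double SHA-256 over an 80-byte header.

    The whole working set is the 80-byte header plus two 32-byte hash
    states, which is why the workload parallelizes so well across
    thousands of GPU threads with almost no per-thread memory.
    """
    header = header_76 + struct.pack("<I", nonce)  # 76-byte prefix + 4-byte nonce
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

# A miner just sweeps the nonce and checks each digest against a target.
digest = mine_attempt(b"\x00" * 76, nonce=0)
print(len(digest))  # 32
```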
 

japtor

Member
Actually, after taking into account Jeffa's comment, we can easily calculate the extra space used up by the eDRAM cache over the SRAM cache as just 2.54mm². That brings us back to the point where things don't really add up.
Did you take into account whatever extra mystery logic on die (or per core?) needed to make the whole thing into a multicore implementation (of a historically single core design) in the first place?
 

wsippel

Banned
Did you take into account whatever extra mystery logic on die (or per core?) needed to make the whole thing into a multicore implementation (of a historically single core design) in the first place?
The system supposedly uses a ring bus for SMP instead of the conventional crossbar design, which is quite unusual as far as I can tell. I dismissed the rumor when it first popped up more than a year ago, but as the rest of the rumor was pretty much spot-on... No idea how much logic that would add.
 

Earendil

Member
The system supposedly uses a ring bus for SMP instead of the conventional crossbar design, which is quite unusual as far as I can tell. I dismissed the rumor when it first popped up more than a year ago, but as the rest of the rumor was pretty much spot-on... No idea how much logic that would add.

Didn't Cell use a ring bus for the SPEs?
 

wsippel

Banned
Didn't Cell use a ring bus for the SPEs?
Dunno. Well, some DSPs use ring buses, and CELL is basically a DSP, so... maybe? As a layman, I would think that ring buses mostly make sense for chips with exactly three cores, as that should allow every core to communicate with every other core directly.
 

Thraktor

Member
Didn't Cell use a ring bus for the SPEs?

I think so, yes. I don't really see an issue with using a ring bus for a 3-core CPU. The topology of the bus isn't much of a concern with so few cores, and the proportion of the die dedicated to the interconnect will be pretty low in any case, I'd imagine.
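(A quick way to see why three cores is a sweet spot for a ring: on a bidirectional ring, every core's two neighbours are the only other cores, so it matches a crossbar's one-hop latency exactly, and the average hop count only grows beyond that. A toy calculation, my own illustration rather than anything known about the actual chip:)

```python
def ring_hops(n: int) -> float:
    """Average hop count between distinct cores on a bidirectional ring.

    A crossbar is always 1 hop. With n = 3 the ring matches it exactly,
    since each core is directly wired to both of the others.
    """
    total = sum(min(abs(i - j), n - abs(i - j))
                for i in range(n) for j in range(n) if i != j)
    return total / (n * (n - 1))

for n in (3, 4, 8):
    print(n, ring_hops(n))  # 3 -> 1.0; larger rings drift above 1 hop
```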
 

Ein Bear

Member
I find it very strange how everything to do with Wii backwards compatibility has to be done by booting up the entire Wii OS and then accessed through that.

I don't understand why the OS stuff can't be running in the background. The 3DS, for example, switches to 'DS Mode' when it's playing BC games, just like the Wii U. But you can still just play the game from the main 3DS menu like a regular 3DS game card, can download DSiWare from the main eShop and place it into folders, and everything still works with system features like the activity log.

If I want to play a Wii disc though, I have to select 'Wii Mode' on the Wii U, boot up the entire Wii OS, and then select the game from in there. Same thing if I want to play a Virtual Console title, or Wii Ware.

Do you guys know any tech reason why this is the case? The whole thing seems really weird to me. Hardware BC has been done plenty of times before, but it's never been quite as closed-off from the main system features as this.
 

wsippel

Banned
I find it very strange how everything to do with Wii backwards compatibility has to be done by booting up the entire Wii OS and then accessed through that.

I don't understand why the OS stuff can't be running in the background. The 3DS, for example, switches to 'DS Mode' when it's playing BC games, just like the Wii U. But you can still just play the game from the main 3DS menu like a regular 3DS game card, can download DSiWare from the main eShop and place it into folders, and everything still works with system features like the activity log.

If I want to play a Wii disc though, I have to select 'Wii Mode' on the Wii U, boot up the entire Wii OS, and then select the game from in there. Same thing if I want to play a Virtual Console title, or Wii Ware.

Do you guys know any tech reason why this is the case? The whole thing seems really weird to me. Hardware BC has been done plenty of times before, but it's never been quite as closed-off from the main system features as this.
I assume they wanted to completely sandbox everything, considering the Wii was hacked to hell and back.
 
Well, I'm glad you brought this up. What if many games are not performing as they could because developers didn't get the latest dev kits, or didn't target game development on the latest dev kits?

I was wondering this as well for the ports. It'll be interesting to see how upcoming ports run compared to the launch games. What are some of the big upcoming third-party releases?
 

MDX

Member
I love that it hardly stressed out an i7-3770K @ 3.5GHz.

We know that the console environment works differently than a PC environment.


I was wondering this as well for the ports. It'll be interesting to see how upcoming ports run compared to the launch games. What are some of the big upcoming third-party releases?


Aliens Colonial Marines, Ghost Recon Online, Project Cars,
Injustice: Gods Among Us, Devil's Third (likely), Homefront 2
 
I was wondering this as well for the ports. It'll be interesting to see how upcoming ports run compared to the launch games. What are some of the big upcoming third-party releases?
MDX said:
Aliens Colonial Marines, Ghost Recon Online, Project Cars,
Injustice: Gods Among Us, Devil's Third (likely), Homefront 2
Aside from those, if the screenshot Trike took of the hacked Miiverse debug menu is genuine, we are also getting Metal Gear Solid, Resident Evil, and Final Fantasy III.
 

Thraktor

Member
Aliens Colonial Marines, Ghost Recon Online, Project Cars,
Injustice: Gods Among Us, Devil's Third (likely), Homefront 2

Splinter Cell Blacklist also. And NFS, but that was apparently ready for launch.

That said, I don't think one or two months will make a big difference. Late next year we'll see games that spent most of their dev cycle on final or near-final hardware, and I wouldn't draw any firm conclusions until then.
 
Splinter Cell Blacklist also. And NFS, but that was apparently ready for launch.

That said, I don't think one or two months will make a big difference. Late next year we'll see games that spent most of their dev cycle on final or near-final hardware, and I wouldn't draw any firm conclusions until then.

Since when is Splinter Cell coming to Wii U?
 

IdeaMan

My source is my ass!
So what you're saying is, there's a game out there that, if they hadn't fixed those problems, would have been performing very badly?

Well, between 20 and 40% additional power from the CPU doesn't translate into 20-40% more framerate. But it was a nice boost for what the dev kit can deliver.
 
Well, between 20 and 40% additional power from the CPU doesn't translate into 20-40% more framerate. But it was a nice boost for what the dev kit can deliver.

Hey Ideaman, was this the same project that you were referring to when you said that devs had doubled their framerate? Was the initial report a mistranslation? Or are we talking about two different games here?
 

deviljho

Member
Aliens Colonial Marines

When Randy said the Wii U version would be the best looking console version, wasn't that around the time of an earlier devkit version? i.e. maybe he's just doing PR; we'll have to wait and see how it looks anyway.
 
Thank you all for keeping this thread civil and informative. There are a lot of posts that I will need to review again to take in all of the knowledge that has been shared.

Well, between 20 and 40% additional power from the CPU doesn't translate into 20-40% more framerate. But it was a nice boost for what the dev kit can deliver.
Thanks for the info, IM. Did they say anything about the game being enhanced further if they had more time to take advantage of that power?

Also, do you know anything about the "locked features" in the dev kits that Matt hinted at?
 
When Randy said the Wii U version would be the best looking console version, wasn't that around the time of an earlier devkit version? i.e. maybe he's just doing PR; we'll have to wait and see how it looks anyway.

What's stopping the Wii U from being the best looking version? From the looks of it, that game isn't going to be running 100 characters on screen at once, so the CPU may not be a bottleneck. It looks like a corridor shooter, so the extra RAM could be used to load in the next room, I'd suppose.
 
When Randy said the Wii U version would be the best looking console version, wasn't that around the time of an earlier devkit version? i.e. Maybe he's just doing PR, will have to wait and see how to looks anyway.
Perhaps, but we do know that the system was weaker during the earlier devkits.
 
Nothing. I was curious about the timing of his comment with respect to the devkit at the time.

Gotcha. Even if they did have the slower kits back then, isn't it possible they had an indication that things would be sped up? I'm sure he also knew about the limited RAM bandwidth and all that. It's also worth noting that even with only 320 shaders at 400 MHz, "Latte" would have a slight edge over Xenos.
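(The back-of-the-envelope for that "slight edge": peak shader throughput is just ALUs x clock x 2 FLOPs per multiply-add. The 320-shader / 400 MHz figures are the hypothetical from the post; Xenos's commonly cited configuration is 240 ALUs at 500 MHz:)

```python
def peak_gflops(alus: int, clock_ghz: float) -> float:
    """Peak single-precision GFLOPS, counting a fused multiply-add as 2 FLOPs."""
    return alus * clock_ghz * 2

latte_guess = peak_gflops(320, 0.4)  # hypothetical "Latte": 256 GFLOPS
xenos = peak_gflops(240, 0.5)        # Xenos (48 vec4+scalar units): 240 GFLOPS
print(latte_guess, xenos)            # 256.0 240.0
```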
 

TheD

The Detective
We know that the console environment works differently than a PC environment.

An i7-3770K is an order of magnitude faster than the WiiU CPU and that is a tech demo without running any game logic.

And that would be by far the least of the WiiU's problems trying to run Agni's Philosophy.

Anyone who thinks it can run it while looking anywhere near close to what we have seen is high.

Will the next iPad be more powerful than the Wii U?

That's the question dictating my wallet.

CPU yes,
GPU probably not.

(and tablets suck at gaming).
 

IdeaMan

My source is my ass!
Hey Ideaman, was this the same project that you were referring to when you said that devs had doubled their framerate? Was the initial report a mistranslation? Or are we talking about two different games here?

Yeah, it's the same projects. But from what I've understood, this unlock of additional power from the CPU (related to their programming, I insist) was just one of the reasons behind the improvements they reached, the main ones being the mass production dev kit, the new SDK, software optimizations, etc. Let's say this nice boost accounted for 5 or 10 fps of the roughly 30 additional ones they managed to get.

Oh and it's not related to the increase of the CPU frequencies between two versions of the dev kit.
 

Orayn

Member
Am I completely off base in explaining to a friend that porting Durango/Orbis games to Wii U will be more like putting Half-Life 2, Far Cry, and Doom 3 on the original Xbox than porting 360/PS3 games to the Wii? It was the simplest way I could describe there being a bigger difference in hardware power than basic compatibility.
 

MDX

Member
An i7-3770K is an order of magnitude faster than the WiiU CPU and that is a tech demo without running any game logic.

But the CPU was not maxed out. It would be different if they had said that the demo maxed out a CPU like an i7; then I could see how it would be an issue for the WiiU CPU.
 

MDX

Member
Am I completely off base in explaining to a friend that porting Durango/Orbis games to Wii U will be more like putting Half-Life 2, Far Cry, and Doom 3 on the original Xbox than porting 360/PS3 games to the Wii? It was the simplest way I could describe there being a bigger difference in hardware power than basic compatibility.

I would suggest a wait-and-see approach. Until the consoles are revealed, anything is possible. Focus more on the fact that the WiiU will have a year or two's worth more games than the other consoles by the time they launch.
 
Am I completely off base in explaining to a friend that porting Durango/Orbis games to Wii U will be more like putting Half-Life 2, Far Cry, and Doom 3 on the original Xbox than porting 360/PS3 games to the Wii? It was the simplest way I could describe there being a bigger difference in hardware power than basic compatibility.

E3 2013 will reveal a lot and disappoint a great many people on this forum.

You're going to see, for the first time, exclusive U games with a decent budget that were built using the final devkits. I'm predicting right now that the likes of Zelda / 3D Mario / Metroid U will look just as good as, if not better than, PS4 / 720 exclusive launch games.

The big Nintendo exclusives built from the ground up on final devkits will make NSMB U, Nintendo Land, Pikmin 3, ZombiU, W-101 and the Zelda HD tech demo look like 'last gen' games imo.

Of course at E3 2014 we will start to see PS4 / 720 flex their graphical muscles after developers have had enough time with final dev kits and not been rushed for launch / Xmas.
 

Stulaw

Member
TheD, can I ask why you think an iPad CPU, which uses the ARM architecture, will be more powerful when the Wii U uses the POWER architecture?

I understand what you're saying, though the Squeenix engine not stressing the i7 should really say that it can run on the Wii U.

Also I'm wondering, does RISC vs. CISC affect game engines? I was wondering since PCs and the rumoured other next-gen consoles will use x86, which is CISC, but the POWER architecture (as well as ARM, MIPS, etc.) is a RISC design. (Note: I've got to learn these things later for university, so I'll find out myself anyway.)
 

THE:MILKMAN

Member
E3 2013 will reveal a lot and disappoint a great many people on this forum.

You're going to see, for the first time, exclusive U games with a decent budget that were built using the final devkits. I'm predicting right now that the likes of Zelda / 3D Mario / Metroid U will look just as good as, if not better than, PS4 / 720 exclusive launch games.

The big Nintendo exclusives built from the ground up on final devkits will make NSMB U, Nintendo Land, Pikmin 3, ZombiU, W-101 and the Zelda HD tech demo look like 'last gen' games imo.

Of course at E3 2014 we will start to see PS4 / 720 flex their graphical muscles after developers have had enough time with final dev kits and not been rushed for launch / Xmas.

I agree Nintendo's own games will look great. But it is crazy talk to suggest that ND/GG/SSM, to name three, will show games at E3 that would at best only be equal in graphics.

That is a brave call.
 

Oblivion

Fetishing muscular manly men in skintight hosery
Hey, there are some new Nano Assault videos that were released this week:

http://www.youtube.com/watch?v=YkoHOGCc66E

Looks pretty good, right? My only problem is that the light emanating from the stars looks like bloom lighting rather than the more modern HDR.*

*I have no idea what I'm talking about.
 
E3 2013 will reveal a lot and disappoint a great many people on this forum.

You're going to see, for the first time, exclusive U games with a decent budget that were built using the final devkits. I'm predicting right now that the likes of Zelda / 3D Mario / Metroid U will look just as good as, if not better than, PS4 / 720 exclusive launch games.

The big Nintendo exclusives built from the ground up on final devkits will make NSMB U, Nintendo Land, Pikmin 3, ZombiU, W-101 and the Zelda HD tech demo look like 'last gen' games imo.

Of course at E3 2014 we will start to see PS4 / 720 flex their graphical muscles after developers have had enough time with final dev kits and not been rushed for launch / Xmas.

I think you're being too optimistic.

I'm expecting the next wave of Wii U games, and the big internal titles from Nintendo due over the next year or two, to look a lot better than what we've got so far, and to start demonstrating clear water between the best of the Wii U and the best of the PS360 titles.

However, I expect to see Orbis/Durango titles showing up at launch - and in demo reels at E3 - that easily surpass what we've seen of Wii U games up to that point. Nintendo are going to be outclassed when it comes to setting the best of the Wii U against Orbis/Durango games next year, even if in some cases (rushed, cross-gen launch efforts) the difference isn't huge, and that gap is going to remain. What Nintendo need to do is use the year's headstart they have to build momentum and support, come out at E3 2013 with the best they have to offer in playable form, and hope that, whatever the internals of the MS/Sony next-gen systems turn out to be, they have done enough to ensure they won't be left out of the loop if publishers see there's money on the table.
 
I think you're being too optimistic.

I'm expecting the next wave of Wii U games, and the big internal titles from Nintendo due over the next year or two, to look a lot better than what we've got so far, and to start demonstrating clear water between the best of the Wii U and the best of the PS360 titles.

However, I expect to see Orbis/Durango titles showing up at launch - and in demo reels at E3 - that easily surpass what we've seen of Wii U games up to that point. Nintendo are going to be outclassed when it comes to setting the best of the Wii U against Orbis/Durango games next year, even if in some cases (rushed, cross-gen launch efforts) the difference isn't huge, and that gap is going to remain. What Nintendo need to do is use the year's headstart they have to build momentum and support, come out at E3 2013 with the best they have to offer in playable form, and hope that, whatever the internals of the MS/Sony next-gen systems turn out to be, they have done enough to ensure they won't be left out of the loop if publishers see there's money on the table.

Like the PS3 'gameplay videos' from E3 2005? ;)

The fact that the big casual franchises (CoD, WWE, UFC, AC, FIFA, Madden, Dance Central, etc.) will be released for PS360 / WiiU / PS4 / 720 will only further push the point home that there will not be a massive power leap, at least at launch, on PS4 / 720.

I also expect a massive round of announcements around Feb/March about as-yet-unknown WiiU ports of PS360 games; I would guess Borderlands 2, Max Payne 3, Far Cry 3, Splinter Cell Blacklist, Need for Speed, Tomb Raider, BioShock, GTA V, MGS Ground Zeroes and Watch Dogs.

These third party publishers want their games out on as many platforms as are capable of running them, and the WiiU certainly can run them.

Next gen might be a completely different story though, especially if those games fail to break 500k sales on the U.
 
Wait... if I remember correctly, somebody posted that one of the guys who ports emulators to consoles said that Broadway clocked @ 729MHz is, in essence, only 20% slower than a single core of Xenon. Could it be that devs (who have never worked with the Wii before) aren't using Espresso to its fullest extent? Hypothetically speaking, Espresso, which is a modified Broadway running at ~1.7x the clock speed and with two more cores, should (and "should" is in italics for a reason) give Xenon a half-decent run for its money.
EDIT: Here is the quote:
Originally Posted by LibretroRetroArc:
I believe if you program only against one main CPU (like we do for pretty much most emus), you would find that the PS3/Xenon CPUs in practice are only about 20% faster than the Wii CPU.

I've ported the same code over to enough platforms by now to state this with confidence - the PS3 and 360 at 3.2GHz are, at best (I would stress), only 20% faster than the 729MHz out-of-order Wii CPU without multithreading (and multithreading isn't a be-all end-all solution, nor a 'one size fits all' magic wand). That's pretty pathetic considering the vast difference in clock speed, the increase in L2/L1 cache and other things - even for in-order CPUs they shouldn't be this abysmally slow, and should be leaving the Wii in the dust by at least 50-70% - but they don't.

BTW - if you search around on some of the game development forums you can hear game developers talking amongst themselves about how crap the 360/PS3 CPUs were to begin with. They were crap from the very first minute the systems launched - with MS hardware executives (according to some 360 'making of' book) allegedly freaking out when IBM told them they would be getting in-order CPUs for their new console - which caused them to place an order for three 'cores' instead of one, because one core would be totally pathetic (pretty much like the PS3, where you only have one main processor and 6/7 highly specialized 'vector' SIMD CPUs that are very fast but also very low on individual RAM and essentially have to do some heavy code weightlifting for you to gain anything). Without utilizing multithreading, you're essentially looking at the equivalent of Pentium 4-spec consoles that have to be helped along by lots of vector CPUs (SPUs) and/or reasonably mid-specced, highly programmable GPUs (which the Wii admittedly lacks).



To be fair though, game developers have learned to go multithreading-heavy, and games these days are all about fancy polygonal models and post-processing shaders - stuff where lots of parallel SIMD processors and a fancy GPU are the main crux of the system, and the PS3/360 are capable there - so they've managed to work around the main CPU being so utterly weak. That, and the fact that most HDTVs cannot be guaranteed to have zero input lag, means you don't have to worry about running your games at 60fps: you would cut out a large percentage of your potential audience anyway, because they wouldn't own a TV with a fast enough input-to-action response time to play it properly. So what they do instead is run at 30fps, which leaves enough wiggle room for even the worst HDTVs with lots of post-processing filters going on that slow down the response time.
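(Taking the quoted 20% figure at face value, the implied per-clock gap is easy to work out; this is just arithmetic on the claim above, not a measurement:)

```python
# If a 3.2GHz Xenon thread is only ~1.2x a 729MHz Broadway on the same
# single-threaded code, Broadway is doing far more work per cycle.
xenon_mhz, broadway_mhz = 3200.0, 729.0
speedup = 1.2  # "only about 20% faster"

xenon_per_clock = speedup * broadway_mhz / xenon_mhz  # Xenon's work/cycle vs Broadway's
print(round(xenon_per_clock, 3))      # ~0.273
print(round(1 / xenon_per_clock, 2))  # Broadway does ~3.66x more work per cycle
```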
 

Argyle

Member
Wait... if I remember correctly, somebody posted that one of the guys who ports emulators to consoles said that Broadway clocked @ 729MHz is, in essence, only 20% slower than a single core of Xenon. Could it be that devs (who have never worked with the Wii before) aren't using Espresso to its fullest extent? Hypothetically speaking, Espresso, which is a modified Broadway running at ~1.7x the clock speed and with two more cores, should (and "should" is in italics for a reason) give Xenon a half-decent run for its money.

Single thread, not single core. Each Xenon core is hyperthreaded.
 

z0m3le

Banned
Gotcha. Even if they did have the slower kits back then, isn't it possible they had an indication that things would be sped up? I'm sure he also knew about the limited RAM bandwidth and all that. It's also worth noting that even with only 320 shaders at 400 MHz, "Latte" would have a slight edge over Xenos.

320 R700 shaders @ 40nm with R740's layout would only be ~80mm²; considering GPU7 is twice as large, even with all the extra components and the 32MB of eDRAM, GPU7 ("mario"/"latte") is far too big to only house 320 shaders.
 

japtor

Member
TheD, can I ask why you think an iPad CPU, which uses the ARM architecture, will be more powerful when the Wii U uses the POWER architecture?

I understand what you're saying, though the Squeenix engine not stressing the i7 should really say that it can run on the Wii U.

Also I'm wondering, does RISC vs. CISC affect game engines? I was wondering since PCs and the rumoured other next-gen consoles will use x86, which is CISC, but the POWER architecture (as well as ARM, MIPS, etc.) is a RISC design. (Note: I've got to learn these things later for university, so I'll find out myself anyway.)
You can't just boil everything down to a few terms. ARM vs POWER (or "Power" in this case?) doesn't mean anything without context. ARM chips are advancing fast as hell, with generational jumps in power every year or two, while the Wii U CPU appears to be similar to what has been used for quite a while now (barring mystery architectural improvements). ARM's stuff is catching up to Intel's lower end, which passed the old PPCs a while back.

For the Squeenix demo, beyond the other stuff that's been mentioned (actual games using the CPU for things demos don't)... the i7 used is an absolute monster, quad core at 3.9GHz, and afaik it destroys an old PPC at instructions per clock. For something that might max out an old PPC, that i7 would barely break a sweat running the same thing.

And RISC vs. CISC is moot these days, iirc. I can't really explain it myself, but I remember reading that, logically, the cores in the AMD/Intel stuff are basically RISC now (and presumably the x64 extensions are) and do some magic to work with x86... either that or they're just flat out fast enough that it doesn't matter.
I could be wrong but I'm pretty sure the guy said core
Effectively the same thing in this case. If something is running as one thread, then all the core can do is process that one thread. If you have more threads, then... well, it depends on the nature of the code. If the extra threads are just as complex as the single one, Xenon might still choke on them anyway. Or maybe it would be better: I'm thinking it'd create more pipeline bubbles, which hyperthreading is meant to fill if possible (good!). Or it'd be an improvement but still leave a bunch of bubbles (better, but still bad).
 

z0m3le

Banned
Hey, there are some new Nano Assault videos that were released this week:

http://www.youtube.com/watch?v=YkoHOGCc66E

Looks pretty good, right? My only problem is that the light emanating from the stars looks like bloom lighting rather than the more modern HDR.*

*I have no idea what I'm talking about.

http://www.youtube.com/watch?v=eqJ3CIuBzms&list=UU5McyOQhdgUqsq2oY0E2SjA&index=4&feature=plcp Coldblooder is running a Live Gamer HD capture card and uploading at as high a quality as he can, so this might be a better one to look at, since the other video has a fair amount of artifacts, though it has absolutely no commentary.
 
OK, so is this a basic overview of the Wii U's main processors?


CPU: IBM PowerPC 750-based tri-core processor "Espresso" @ 1.24 GHz

Co-Processor: Multi-core ARM processor "Starbuck*" @ 550MHz?

GPU: AMD Radeon HD based on R700 series "Latte/GPU7" @ 550 MHz

DSP: Currently Unknown @ 120/200 MHz



I don't think we have any new information on the ARM processor or the DSP yet.
 