
WiiU technical discussion (serious discussions welcome)

tipoo

Banned
I'm voting for Durbango.


I'm not seeing anything particularly impressive - CPU-wise - that would be impossible on Wii (which, ironically, had a few pretty impressive physics-based games)

Which games, may I ask? I recall a few whose names elude me, but they were small, closed-world room games with that kind of physics, which is quite different from having it in a more open world with expansive numbers of objects.
 

ahm998

Member
Is English not your first language? If it is...
Anyway, the card HAS tessellation. Read the last few pages

Yeah, sorry for my bad English.

Arabic is my first language; in my country, hardly anyone speaks English.

Can I get an example of how the architecture works?
 

tipoo

Banned
Some more details of minor interest:

core 1 is not normally active in wii mode (the "wii core" is core 0, and half of its L2 is disabled for wii mode)


https://twitter.com/marcan42

Cores 0 and 2, as we knew, have only half a megabyte of cache, and core 1 has 2MB. Only one core is active in Wii mode (as he said earlier, it basically becomes a Wii in Wii mode).

I wonder if core 1 is any different apart from the larger cache.
 

tipoo

Banned
What's happening?

(Sorry need to catch up on what's going on)

A bunch of us chipped in to pay for Chipworks' $200 scans of the GPU; we're getting them tomorrow or the day after, most likely. Possibly the CPU too, later. That will let us count the shaders etc.
 

OryoN

Member
I think he's combining the CU's from the GPU-side into the CPU figure?

I really don't think a 5-7x disadvantage is anything to scoff at anyway.

I thought he might be, but then I wondered why he would do this. How is comparing Orbis' GPU compute capabilities to Wii U's CPU alone even a fair comparison? Anyway...

I wouldn't say "scoffing"... I'm just not overly excited about theoretical performance these days, as it never produces anywhere close to the performance we expect. I'm still waiting for Cell to produce AI/physics/etc. significantly beyond what we've seen from the PS2 era. That jump was not 5x, not 7x, not 10x, but a ~20x increase in theoretical performance! But that's also what makes the actual real-world results even more embarrassing. We could probably list a number of reasons why it turned out (or just seems) that way, but it is what it is.

I think I'll stop here, cause we might be getting a bit off topic.
 

MysticX

Member
I love the WiiU, the only thing that bugs me is the SLOW OS!

bloody hell everything takes too long!

Do we have a WiiU buddy list somewhere here on GAF?
 

Meelow

Banned
A bunch of us chipped in to pay for Chipworks' $200 scans of the GPU; we're getting them tomorrow or the day after, most likely. Possibly the CPU too, later. That will let us count the shaders etc.

So we will know how powerful the GPU is?

I love the WiiU, the only thing that bugs me is the SLOW OS!

bloody hell everything takes too long!

In my opinion the OS isn't as slow as everyone makes it out to be, but Iwata did confirm that for those people finding it slow it will be fixed.
 

Vaporak

Member
Didn't we already conclude that the difference is because of the windows driver? So it wouldn't matter as much on a dedicated console?

I've seen assertions that it's because of the software interface between DX and the AMD driver. I'm not an expert on the matter, but that strikes me as less important than simply the changing nature of graphics algorithms. If it's the algorithms that are mostly dictating average hardware utilization, then moving to an embedded software environment wouldn't really make up for that kind of difference.

Could someone explain shaders to me?

I've heard them talked about for over a year regarding WiiU; the Wii was often referred to as not having enough of them, and now Miyamoto is mentioning them.

What exactly are they? How are they used? Are they hardware or software based? Could you link a screenshot of a game with and without them?

I'm guessing it's a technique used to make more realistic-looking games, as even without them the Wii had some incredible-looking 'cartoony' games.

I could be way off the mark; I really have no idea, I'm just interested to know what they are.

Can't find much on Google, and I don't want to start a new thread when one of the experts in here could explain them in a single post :p.

Thanks.

No one can really give you the comparison you want, because "shader" just means programmable graphics code. It's important because, more than a decade ago, graphics hardware was not programmable: you could only do a narrow range of fixed operations when programming.
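To make that a bit more concrete, here's a loose illustration (in Python rather than a real shading language, and entirely hypothetical): a pixel shader is just an arbitrary function the GPU runs once per pixel, where fixed-function hardware could only select from a few built-in operations. This toy example computes simple diffuse lighting, the kind of math a basic shader might do.

```python
def lambert_shader(normal, light_dir, base_color):
    """A minimal diffuse-lighting 'shader' for one pixel (toy example)."""
    # Dot product of surface normal and light direction, clamped to >= 0
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    # Scale the surface colour by the incoming light
    return tuple(c * ndotl for c in base_color)

# A pixel facing the light head-on keeps its full colour...
print(lambert_shader((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), (1.0, 0.5, 0.25)))
# ...while a pixel facing away from the light ends up black.
print(lambert_shader((0.0, 0.0, -1.0), (0.0, 0.0, 1.0), (1.0, 0.5, 0.25)))
```

On real hardware this would be written in a shading language like HLSL or GLSL and executed in parallel across the shader units being discussed in this thread.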
 

OryoN

Member
I'm voting for Durbango.


Which games, may I ask? I recall a few whose names elude me, but they were small, closed-world room games with that kind of physics, which is quite different from having it in a more open world with expansive numbers of objects.

Elebits is the game you probably have in mind.
http://www.youtube.com/watch?v=3xJXvFqhCk0

That argument gets used all the time, but where are those big open-world, heavy-physics games on PS360? Or where are the smaller indoor games with physics way beyond the level seen in Elebits? It goes both ways.
 

Durante

Member
tipoo, you're right that GCN will be advantageous, but not anything like the link you gave above; it will be closer to 10%.

VLIW4 and VLIW5 in a console would be basically identical, and in DX they have a ~20% difference. VLIW is simply very weak for high-level coding because often 20-40% of the SPs go unused; this wouldn't happen in a console.
I still don't buy this supposition that all shaders in all console games are written that close to the metal. Wouldn't the vast majority of developers outside of first party marquee titles still use HLSL (or whatever high level shader language the API offers)?

Where are these figures coming from?
I have no idea. The most likely figures put WiiU's CPU at 14 GFLOPs and the 8-core Jaguars at 102 -- for a factor of ~7.
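A quick sanity check of that "~7" factor, using the figures quoted above (which are themselves rumoured estimates, not confirmed specs):

```python
# Rumoured theoretical peak figures from the thread, in GFLOPs
wiiu_cpu_gflops = 14
jaguar_8core_gflops = 102

# The ratio between the two
ratio = jaguar_8core_gflops / wiiu_cpu_gflops
print(round(ratio, 1))  # ≈ 7.3, i.e. "a factor of ~7"
```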
 

Donnie

Member
So we will know how powerful the GPU is?

It will show us the exact number of shader units, ROPS and texture units. Combined with the clock speed we already know that will give us the GPU's theoretical performance.

Take no notice of any crazy person who tells you that's "guessing"...
 
I have no idea. The most likely figures put WiiU's CPU at 14 GFLOPs and The 8-core Jaguars at 102 -- for a factor of ~7.

That really puts things in perspective. For comparison purposes, my 2007-2008-era laptop Core 2 Duo T9550 CPU cranks out 18-20 GFLOPs in Linpack 10.3.4.

The Wii U CPU really can't be that slow, even against severely outdated hardware. Can it?
 

Donnie

Member
That really puts things in perspective. For comparison purposes, my 2007-2008-era laptop Core 2 Duo T9550 CPU cranks out 18-20 GFLOPs in Linpack 10.3.4.

The Wii U CPU really can't be that slow, even against severely outdated hardware. Can it?

You're making the mistake of taking theoretical flop numbers as some kind of true measure of overall CPU performance. The same kind of logic would suggest that XBox 360's CPU is about the same in performance as the Durango/Orbis CPU, or that 360's CPU is 7x as powerful as WiiU's, obviously not the case.
 

Meelow

Banned
It will show us the exact number of shader units, ROPS and texture units. Combined with the clock speed we already know that will give us the GPU's theoretical performance.

Take no notice of any crazy person who tells you that's "guessing"...

What exact number of shader units, ROPS and texture units would be good?
 

wsippel

Banned
That really puts things in perspective. For comparison purposes, my 2007-2008-era laptop Core 2 Duo T9550 CPU cranks out 18-20 GFLOPs in Linpack 10.3.4.

The Wii U CPU really can't be that slow, even against severely outdated hardware. Can it?
Espresso has no real SIMD unit, so raw floating-point number crunching simply isn't its forte.
 
You're making the mistake of taking theoretical flop numbers as some kind of true measure of overall CPU performance. The same kind of logic would suggest that XBox 360's CPU is about the same in performance as the Durango/Orbis CPU, or that 360's CPU is 7x as powerful as WiiU's, obviously not the case.

Espresso has no real SIMD unit, so raw floating-point number crunching simply isn't its forte.

Thanks for the replies. I've been schooled today; I just learned that I have to make a distinction between single-precision FP and double-precision FP performance when it comes to FLOPs measurements. I recall the PS3's Cell was an SFP monster (which is why it excelled in applications like Folding@Home) but wasn't so good with DFP performance. Likewise with the 360.

Didn't realize that the Espresso didn't have a proper SIMD unit. I thought it was pretty much a given in most modern CPU architectures.
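For what it's worth, the ~14 GFLOPs figure quoted earlier is roughly consistent with a paired-singles FPU rather than a wide SIMD unit. A back-of-the-envelope check, assuming (rumoured, not confirmed) three cores at ~1.24 GHz, each retiring one 2-wide paired-singles multiply-add per cycle:

```python
# Rough peak-FLOPs estimate under the stated assumptions
cores = 3
clock_ghz = 1.24                 # rumoured Espresso clock
flops_per_core_per_cycle = 2 * 2 # 2-wide paired singles x multiply-add

gflops = cores * clock_ghz * flops_per_core_per_cycle
print(round(gflops, 1))  # ≈ 14.9, in the ballpark of the quoted 14 GFLOPs
```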
 

Donnie

Member
What exact number of shader units, ROPS and texture units would be good?

I'd say 320 shaders would be disappointing (still more powerful than 360 or PS3, but disappointing nevertheless). 400 would be adequate, while 480 would be very good IMO. That's the range I've been thinking of for a while, so if I have to guess I suppose I'll go in the middle and say the pic will show 400 shader units (but I'll hope for more :)).

As far as ROPs, I've been expecting 8, just like Xenos, but I suppose we could see 12 to help deal with the extra fillrate needed for the controller's screen. I'm not that concerned about it, though, considering 720p will be the standard resolution for most WiiU games IMO. Same goes for texture units: probably 16 to 20.
 
I'd say 320 shaders would be disappointing (still more powerful than 360 or PS3). 400 would be adequate, while 480 would be very good IMO. Anything more would be very impressive, especially considering power usage.

As far as ROPS, I was expecting 8, I doubt we'll see more than 12 and to be honest I'm not that concerned considering 720p will be the standard resolution for most WiiU games IMO. Same goes for texture units, probably 16 to 20.

Interesting you say 320 would be disappointing, as the Radeon 2900/3850/3870/4670 series only had 320 shaders and were decent cards for the price at the time of their release.

The 4670 in particular can still run any modern game fluently at 720p (which the Wii U is targeting) and has 8 ROPs/32 TMUs. I should know: I've got one (the mobility variant @ 831/882 core/memory clocks) still functioning with the aforementioned Core 2 Duo T9550 processor.

Still runs games better than the 360/PS3 combo.
 

PetrCobra

Member
How these CPUs actually perform in real-world cases should be interesting. Wii's CPU was at a great disadvantage in theoretical performance, yet in actual games I'm still asking myself, "what was all the noise about?" I'm not seeing anything particularly impressive - CPU-wise - that would be impossible on Wii (which, ironically, had a few pretty impressive physics-based games)

Well, one thing is that the Wii is supposed to be quite constrained when it comes to the number of moving AI objects on screen (enemies, for instance), and Wii U could be in a similar position if we are to believe the early developer impressions (the Warriors Orochi 3 developers, for instance).

The example that pops into my head instantly is Dead Rising and the stripped-down Wii port of that game. A much lower number of enemies, I'm told.
Then again, there was Sengoku Basara 3, which has a lot of enemies in it, and I'm not in a position to compare the AI in these games (though I guess the different environments and game styles would make the AI requirements somewhat, well... different). It would also be interesting for someone to compare that game with the aforementioned Warriors Orochi 3, which I didn't play (but it looks pretty similar to me), so I don't know what noticeable things might make that game so much more taxing (apart from graphics, of course).
 

Meelow

Banned
I'd say 320 shaders would be disappointing (still more powerful than 360 or PS3, but disappointing nevertheless). 400 would be adequate, while 480 would be very good IMO. That's the range I've been thinking of for a while, so if I have to guess I suppose I'll go in the middle and say the pic will show 400 shader units (but I'll hope for more :)).

As far as ROPs, I've been expecting 8, just like Xenos, but I suppose we could see 12 to help deal with the extra fillrate needed for the controller's screen. I'm not that concerned about it, though, considering 720p will be the standard resolution for most WiiU games IMO. Same goes for texture units: probably 16 to 20.

How would that compare to the PS4/720 spec rumors?
 

Donnie

Member
Interesting you say 320 would be disappointing, as the Radeon 2900/3850/3870/4670 series only had 320 shaders and were decent cards for the price at the time of their release.

The 4670 in particular can still run any modern game fluently at 720p (which the Wii U is targeting) and has 8 ROPs/32 TMUs. I should know: I've got one (the mobility variant @ 831/882 core/memory clocks) still functioning with the aforementioned Core 2 Duo T9550 processor.

Still runs games better than the 360/PS3 combo.

You also have to consider clock speed here. At the clock speed your Mobility 4670 runs at, its theoretical shader performance would be 532 GFLOPs. WiiU's GPU runs at 550MHz, so 320 shader units would give it only 352 GFLOPs. 480 shader units would give WiiU's GPU the same theoretical shader performance as your 4670.
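The numbers in the post above all come from the standard theoretical-peak formula for these GPUs: each stream processor counts as one multiply-add (2 flops) per clock cycle. A small sketch reproducing them:

```python
def shader_gflops(stream_processors, clock_mhz, flops_per_sp_per_cycle=2):
    """Theoretical peak shader throughput: SPs x clock (GHz) x flops/cycle."""
    return stream_processors * (clock_mhz / 1000.0) * flops_per_sp_per_cycle

# Mobility HD 4670: 320 SPs at the quoted 831 MHz
print(shader_gflops(320, 831))  # ≈ 532 GFLOPs

# Hypothetical Wii U GPU at 550 MHz with 320 vs 480 SPs
print(shader_gflops(320, 550))  # 352 GFLOPs
print(shader_gflops(480, 550))  # 528 GFLOPs, roughly matching the 4670
```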
 

Donnie

Member
Shader units = stream processors? If so, reading that 480 is "very good" is incredibly disappointing.

Well, you have to look at it from the perspective of what's possible/probable given the size of the chip and the system's power usage, and also in comparison to other systems. 480 shaders would give developers quite a bit of headroom to produce games very noticeably ahead of anything possible on 360/PS3 (over twice the theoretical performance, plus better efficiency from the newer design).
 

Donnie

Member
How would that compare to the PS4/720 spec rumors?

To give a decent answer to that isn't easy, and not something I want to attempt at 11pm :) I'll pop back in tomorrow and hopefully remember to give you my opinion on that. Though to be honest I think I've given it already quite a lot in this thread and others on this forum :D
 
According to Marcan, Wii U has the same ARM9 security processor as Wii (I suppose this explains the speed with which he was able to crack it). There is, however, an additional multi-core (likely dual-core) ARM on there as well, handling video streaming.
 

ozfunghi

Member
Shader units = stream processors? If so, reading that 480 is "very good" is incredibly disappointing.

That would be over twice the raw performance of 360's GPU, in combination with a more complete (modern) feature set. I don't know what exactly you were expecting from such a tiny console that consumes a measly 30W.
 

Meelow

Banned
To give a decent answer to that isn't easy, and not something I want to attempt at 11pm :) I'll pop back in tomorrow and hopefully remember to give you my opinion on that. Though to be honest I think I've given it already quite a lot in this thread and others on this forum :D

I'll PM you tomorrow to remind you lol.
 

AzaK

Member
According to Marcan, Wii U has the same ARM9 security processor as Wii (I suppose this explains the speed with which he was able to crack it). There is, however, an additional multi-core (likely dual-core) ARM on there as well, handling video streaming.

Is this a tease based on recent developments in the secret lab, or old info? I can't remember :0
 

LeleSocho

Banned
Well, you have to look at it from the perspective of what's possible/probable given the size of the chip and the system's power usage, and also in comparison to other systems. 480 shaders would give developers quite a bit of headroom to produce games very noticeably ahead of anything possible on 360/PS3 (over twice the theoretical performance, plus better efficiency from the newer design).

So you can confirm that those are synonyms?
Yeah, I know I have to weigh it against the size and the power, but in my head I'm still comparing the WiiU to my ideal vision of it with a mid-high-end 2008 GPU (HD4850) or a mid-range 2009 GPU (HD5770). Watching other companies fabricate millions and millions of devices at 32/28nm, I always thought Nintendo would adopt one of those nodes, so using one of the two GPUs I've mentioned would've been possible even given the size and power draw of the thing...

I really have to stop thinking that the WiiU will be capable of doing something nice in terms of power :(
 
That would be over twice the raw performance of 360's GPU, in combination with a more complete (modern) feature set. I don't know what exactly you were expecting from such a tiny console that consumes a measly 30W.

I've always been of the opinion that if WiiU is as powerful as the 360 then it would be fine; look at the games people have been able to get running on the 360 - the likes of Halo 4, Forza Horizon, Gears 3, and all the fantastic-looking multiplatform games like BF3, Far Cry 3, Skyrim and Crysis 3.

If WiiU ends up being 2x an Xbox 360 (1GB of RAM / 480 GFLOP GPU) then I will be over the moon with what Nintendo has done, especially, as you say, in such a low-powered, tiny console.

First-party games will look incredible and, because of the art style, will compete with the very best PS4/720 exclusives for the first few years, imo.
 

ozfunghi

Member
but in my head I'm still comparing the WiiU to my ideal vision of it with a mid-high-end 2008 GPU (HD4850) or a mid-range 2009 GPU (HD5770).

That would put it in the same ballpark as Durango... 1Tflop DX10.1 - 1.36Tflop DX11
Seems like wishful thinking on your part.
 
According to Marcan, Wii U has the same ARM9 security processor as Wii (I suppose this explains the speed with which he was able to crack it). There is, however, an additional multi-core (likely dual-core) ARM on there as well, handling video streaming.
Ok, thanks for the update. I did suspect that there was another ARM in the system besides the security one.
 

OryoN

Member
According to Marcan, Wii U has the same ARM9 security processor as Wii (I suppose this explains the speed with which he was able to crack it). There is, however, an additional multi-core (likely dual-core) ARM on there as well, handling video streaming.

I wonder if this has anything to do with the in-game video streaming capability - as seen in some of NintendoLand's attractions - which one dev (Shin'en?) said had virtually no negative effect on performance when they enabled the feature.

So that's one ARM9 processor, one (likely) dual-core ARM, a DSP, an I/O processor... is all of that on the GPU die itself? What else could Nintendo/AMD possibly have thrown in there?
 

Meelow

Banned
3D Mario has always been a graphical showcase.

Super Mario 64 (1996) - Nintendo 64


Super Mario Sunshine (2002) - Nintendo GameCube


Super Mario Galaxy (2007) - Nintendo Wii (a console that's a little bit more powerful than GameCube)


Now imagine what Super Mario Universe could look like
 

LeleSocho

Banned
That would put it in the same ballpark as Durango... 1Tflop DX10.1 - 1.36Tflop DX11
Seems like wishful thinking on your part.

Only if you look at the flop/s numbers; a 7850/7870 performs way better than those two cards... they have 50% more stream processors and much higher bandwidth, texel rate and pixel rate...
Here's a benchmark - you'll see that the difference is huge, but at the same time it would've been a very noticeable jump over PS360.
 
I don't really know how Nintendo is going to better Mario Galaxy 2 running on Dolphin, tbh; I think much of the improvement in the new 3D Mario game will come from much larger environments instead of the small planet levels from SMG.

Since we should stay on topic, I'd like to ask what kinds of features and effects Nintendo will be able to include in their first-party games that we have never seen from them before.

Things the Wii was simply not capable of. I imagine much more detailed textures for a start, since going from 88MB of RAM in the Wii to 1GB in the WiiU is perhaps the biggest generational leap ever with regard to RAM? But again, I can't see Nintendo going overboard with textures etc. in a very cartoony-looking Mario game (one of the reasons I think they went with either a 0.5x or 1x leap over a 360).
 
Only if you look at the flop/s numbers; a 7850/7870 performs way better than those two cards... they have 50% more stream processors and much higher bandwidth, texel rate and pixel rate... look at some benchmarks
That was already answered. In a console, this difference in performance per flop won't be nearly as high as it is on PC, and performance per mm^2 would be higher.

I don't know what to expect from the WiiU GPU. If it's as customized as was suggested earlier by people who presumably had info about it, then FLOPs won't matter as much as with the other consoles, because of the fixed functions included (and to emulate Wii at a hardware level, some fixed functions have to be there, so some effects will be "free" or much cheaper to achieve than on other consoles without them).
 

Meelow

Banned
Even then, it still looks freaking good on my 42 inch HDTV.

Mario Universe is probably gonna blow my brains out.

Yeah, the picture I chose is low quality; the game looks really good on a TV.

I think Mario Universe will look amazing - it could possibly compete with PS4/X720 games.
 

PetrCobra

Member
Can't wait for Mario Universe and Retro's game to shut up the naysayers.

Haha... they probably won't live up to the expectations of SOME PEOPLE, but I can't say I'll care. I'm sure the gameplay will blow me away nonetheless.

Still, I'm curious what they'll be able to pull off. Retro especially. The first time I played Metroid Prime (and MP3) I couldn't believe my eyes.
 