Wii U GPU base specs: 160 ALUs, 8 TMUs, 8 ROPs; Rumor: Wii U hardware was downgraded

Status
Not open for further replies.
I don't see huge gains coming. Engines which maxed out the 7th gen will already be close to maxing the Wii U. It has double the RAM and, so far from what I can see, the ability to generate prettier lighting...and that's it. Sure, put some real budget into a game and picture GOW 3 but with better textures and lighting. Hey, that right there looks nice. But by the time such a game appears, Naughty Dog proper will be showing what they have up their sleeve.
And by the time Naughty Dog is doing that some PC centric developer will be showing off purely realtime SVOGI running at 60fps.

Console gamers are willing to sacrifice visual quality anyway; it just seems to be by degrees with some. Wii U games won't be any uglier than PS4 games. They'll just be using comparatively muted tech to achieve their beauty.

I mean we're in an era where even handhelds are achieving stuff like Revelations and Uncharted.

Ugliness and beauty are only matters of degree today. A bigger emphasis on cohesion of assets will matter more going forward.
 
I think one thing that can be said for the Wii U is that the games at the end of the gen will look better than the launch games by a larger amount than the other two consoles. There's more room to improve there due to the complex architecture and lack of documentation that was rumoured when launch titles were being developed.
If anyone is willing to dump big money + rare talent into a big project, sure. I doubt even Nintendo wants to do that though.
 
Probably a stupid question, but would Nintendo provide these specs to developers?

Or is your source just a developer who's looked at the die shots and come up with the same conclusion as people in the GPU thread?
Haha. I don't think it's stupid. But the source of the info only provided what was shown in the SDK. The numbers they give devs are in different terminology than what I said in the OP, but are pretty easy to translate on their own. With the die shot and past discussion along with it there's no denying it's a 160-ALU part, but like I said the numbers given tell you that without the die shot.
 
I'm fine with the Wii U not being a powerhouse. What bothers me is: how is it possible that they sell it at 350 euros and take a loss?!? The GamePad is not bleeding-edge tech for sure, and don't tell me the streaming tech went through 4 years of research and development...
Is it the custom chip? So, where is the interest?
I am no techie so I really don't understand these choices...
I think one aspect that is overlooked is the exchange rate to western currencies. A strong Yen actually hurts exports.
 
Given realities of the WiiU ecosystem I don't think this is much of a problem anymore.

It's firmly a "Nintendo" system. You buy one if 3D World or Zelda tickles your fancy. It was never likely to be a 3rd party system, even with a substantive capability upgrade. Publishers have had an aversion to funding projects on Nintendo hardware for generations; in the past two, Nintendo has just given them firm reasons.

Nintendo design imperatives are antithetical to the modern industry.
If they'd actually put more into being 3rd party friendly, they'd likely be able to sell more units, since fewer people would need a second machine.

And with the WiiU already being more expensive than the Wii to begin with, the "get it for Nintendo only" angle won't serve any purpose.
 

Lonely1

Haha. I don't think it's stupid. But the source of the info only provided what was shown in the SDK. The numbers they give devs are in different terminology than what I said in the OP, but are pretty easy to translate on their own. With the die shot and past discussion along with it there's no denying it's a 160-ALU part, but like I said the numbers given tell you that without the die shot.
I envy you. I would like to have a look at such numbers. :(
 
The PS4 is quite a bit more powerful than the X1, and yet Ryse looks amazing. The Wii U can't even blow the doors off ports from older machines, while these new systems are coming out of the gate with some pretty impressive-looking games that blow it away in terms of technical ability. Looking closely at the specs, one can readily assume they've only just started. The same cannot be said for the Wii U.
Lone case... But Need for Speed says hello...
It can be done if the devs bother
 
OK, I don't know if this is already known, but here is a forum post from the official Project CARS forum. It is from the "PC Render Coder" of the project and there was a talk about the Wii U. He replied:



Considering this is an official game dev, what do you think about his remarks on the console's performance?
Great news right there. People like numbers and comparing console parts to PC parts, but in a closed, custom-built environment numbers don't tell the whole story. The Wii U has yet to be used to its full potential; I hope these guys and Shin'en show what the Wii U can do.
 

KingSnake

OK, I don't know if this is already known, but here is a forum post from the official Project CARS forum. It is from the "PC Render Coder" of the project and there was a talk about the Wii U. He replied:



Considering this is an official game dev, what do you think about his remarks on the console's performance?
Nice little piece of info.
 
Interested in hearing what those in the know have to say about that quote. That's the most direct I've seen any developer talk about the Wii U. This thread should get interesting once again with these revelations. I'll be sure to pop in a bit later after I'm done with my paper. Nice find, PatientX.
 
OK, I don't know if this is already known, but here is a forum post from the official Project CARS forum. It is from the "PC Render Coder" of the project and there was a talk about the Wii U. He replied:



Considering this is an official game dev, what do you think about his remarks on the console's performance?
If the dev states it's 192 then it's 192. Maybe post to Shin'en to see if they will confirm it. Finally people can stop beating the speculative 160 drum that they took as fact. Can you get a link, if that's allowed?
 
OK, I don't know if this is already known, but here is a forum post from the official Project CARS forum. It is from the "PC Render Coder" of the project and there was a talk about the Wii U. He replied:



Considering this is an official game dev, what do you think about his remarks on the console's performance?
Hmmm...192 ALUs. That explains, to some extent, what we thought was too large a space taken up by them on the die. But the blocks are still too big for a 192 ALU part, aren't they..? I seem to remember someone in the Latte thread saying that they were 90% too large to be 160 ALUs. So perhaps the rest of the space is taken up by fixed functions..? 192 is only a 32 ALU difference, and the space is almost twice the size needed.

Unless I've got that 90% figure wrong..? Have I remembered that correctly..?

And we still have only one theory, from wsippel if I'm remembering right, about why the TMU:ROP ratio is 1:1. But if that theory about 8 x TEV units is right, then what's taking up the extra logic in the ALUs..?
 
OK, I don't know if this is already known, but here is a forum post from the official Project CARS forum. It is from the "PC Render Coder" of the project and there was a talk about the Wii U. He replied:



Considering this is an official game dev, what do you think about his remarks on the console's performance?
We'll have to actually wait and see the game running in motion. Someone on GAF claimed that Straight Right did some further optimization on the Wii U version of Deus Ex: HR DC given the extra dev time/delay.
 
OK, I don't know if this is already known, but here is a forum post from the official Project CARS forum. It is from the "PC Render Coder" of the project and there was a talk about the Wii U. He replied:



Considering this is an official game dev, what do you think about his remarks on the console's performance?
Very interesting. That was very direct. 192 shader units is a weird number to come up with.

As BG stated, the actual shader count was not stated in any documentation. It was calculated from the other information given.
 
Great news right there. People like numbers and comparing console parts to PC parts, but in a closed, custom-built environment numbers don't tell the whole story. The Wii U has yet to be used to its full potential; I hope these guys and Shin'en show what the Wii U can do.
Comparing console parts and PC parts is perfectly appropriate, closed box or not. A 7850 in a console doesn't magically get the same performance as a 7950 or even a 7870 just because it's in a closed box. That said, the Wii U is a different animal altogether and doesn't really use any conventional CPU/GPU combination, which is what makes comparisons next to impossible.
 
OK, I don't know if this is already known, but here is a forum post from the official Project CARS forum. It is from the "PC Render Coder" of the project and there was a talk about the Wii U. He replied:



Considering this is an official game dev, what do you think about his remarks on the console's performance?
Cool stuff. I'm eager to see where Wii U goes technically, but still kinda miffed at how it all turned out.
 

KingSnake

Maybe we can have an updated title now?

If we consider both pieces of info, we can say that the Wii U was actually upgraded from 160 to 192 shader units in the meantime. Must have been the summer update. Lol.
 
Need for Speed says hello? I'm sorry, what resolution was it at again? What aspects were improved? A bit of lighting and textures, and otherwise it's the 7th gen version?
Well, they started on the Wii U version in November/December 2012, right, and released the game in March/April 2013?
Pretty cool what they could pull off in that short time. Now imagine if they had the same development time as on the other consoles.. JUMMY!
 
Well, they started on the Wii U version in November/December 2012, right, and released the game in March/April 2013?
Pretty cool what they could pull off in that short time. Now imagine if they had the same development time as on the other consoles.. JUMMY!
Exactly. People forget we are getting the bare minimum effort; despite some devs wanting to make a difference and implement cool things, no Wii U 3rd party game can be called a big project.
 
192 shaders would be very very odd. What's that? VLIW6? lol. The guy must have gotten confused. I'm interested in how he came to that number, though.

Just remember, bg's info has passed mod inspection. IMO, that makes it solid. An outright statement from someone who should be under an NDA (if they are in a position to know), seems very unlikely.
 
192 shaders would be very very odd. What's that? VLIW6? lol. The guy must have gotten confused. I'm interested in how he came to that number, though.

Just remember, bg's info has passed mod inspection. IMO, that makes it solid. An outright statement from someone who should be under an NDA (if they are in a position to know), seems very unlikely.
That's custom hardware for you, LOL. If you are a member over there, go ask him if he'd share details on that number.
 
192 shaders would be very very odd. What's that? VLIW6? lol. The guy must have gotten confused. I'm interested in how he came to that number, though.

Just remember, bg's info has passed mod inspection. IMO, that makes it solid. An outright statement from someone who should be under an NDA (if they are in a position to know), seems very unlikely.
EatChildren said BG's data was plausible by cross-referencing the SDK with the die shot, but did not directly list the numbers. It's not impossible. Some of the embedded Richland AMD parts are 192 shaders, but are VLIW4.
 
I think one thing that can be said for the Wii U is that the games at the end of the gen will look better than the launch games by a larger amount than the other two consoles. There's more room to improve there due to the complex architecture and lack of documentation that was rumoured when launch titles were being developed.
The docs are still lacking today, at least if you compare them to certain other western-centric companies'. You are only partially correct: the Xbox One might show a similar level of improvement from start to finish as the Wii U, as both are more esoteric. The PS4 is much easier to "max out" in the sense that's generally referred to on this forum.
 
192 shaders would be very very odd. What's that? VLIW6? lol.
I haven't been keeping up with the Wii U's hardware lately, but that does seem odd. From what I remember, VLIW5 used groups of 80 ALUs. I can't remember what they used to call those groups, but I guess it's sort of similar to GCN's compute units (which are 64 ALUs). Feel free to correct me on this.
 
that's a lot of shaders.

I had a search, and people on the net (I don't know if their info is right or not) seem to say the Xbox 360 has 48 and the PS3 56.
Those can't be compared, because there was a huge architecture shift from the GPU used in the 360 to the HD 2xxx and onward AMD GPUs.

The Radeon 2900, with 320 SPs, was around twice as powerful as the PC version of the 360 GPU.

Also, Nvidia counted them differently too, before the 4xx series.
 
EatChildren said BG's data was plausible by cross-referencing the SDK with the die shot, but did not directly list the numbers. It's not impossible. Some of the embedded Richland AMD parts are 192 shaders, but are VLIW4.
I don't think it would be VLIW4 (48 ALU x 4?). Then again, 192 is not a multiple of 5 so it can't be pure VLIW5..
192 shaders would be very very odd. What's that? VLIW6? lol. The guy must have gotten confused. I'm interested in how he came to that number, though.

Just remember, bg's info has passed mod inspection. IMO, that makes it solid. An outright statement from someone who should be under an NDA (if they are in a position to know), seems very unlikely.
It could be possible that those numbers were outdated, but that would be a major change.

that's a lot of shaders.

I had a search, and people on the net (I don't know if their info is right or not) seem to say the Xbox 360 has 48 and the PS3 56.
The Xbox 360 used unified, five-wide shader units, so you have to multiply that number by 5 (48 x 5 = 240 shader ALUs). Those units were not as efficient nor as modern as the ones in Latte, though. As for the PS3, the GPU had an older architecture, and I can't remember how you calculate them.
 
Posted it on the other thread, but I feel it's relevant here as well.


There's only a few AMD designs using 192 shader units, and ironically their introduction happened the same year the Wii U launched.

They are:

Radeon HD 7540D/7520G/7400G (Trinity IGP's)
Released: October, May, September 2012 (APU: A6-5400K/A6-4400M/A6-4355M)
Codename: Scrapper
Based on: HD 69xx's VLIW4 ISA
192 Shader Units
12 Texture Mapping Units
4 Render Output Units (ROPs)

Memory Bus: DDR3 Memory Controller (DDR3-1866/1600/1333)

292 GFlops @ 760 MHz (HD 7540D)
190 GFlops @ 496 MHz (HD 7520G)
125 GFlops @ 327 MHz (HD 7400G)

211 GFlops @ 550 MHz (Wii U GPU clock)


Take it as you will, but seeing how the timeline fits, perhaps it's not so farfetched to believe they have something in common beyond the already obvious DDR3 bus (VLIW4, or some structure/architecture decisions, perhaps). If only we could see a GPU die shot from those. Still, we've looked into IGP similarities here before, and we found the blocks to be more akin to them than to R600/R700 parts... I think?
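For what it's worth, the GFLOPS figures in the list above follow the usual peak-throughput formula: ALUs x 2 FLOPs per cycle (one multiply-add) x clock. A quick check (small rounding differences aside):

```python
# Peak GFLOPS = shader ALUs x 2 FLOPs/cycle (multiply-add) x clock in GHz.
def peak_gflops(alus, clock_mhz):
    return alus * 2 * clock_mhz / 1000.0

for name, mhz in [("HD 7540D", 760), ("HD 7520G", 496),
                  ("HD 7400G", 327), ("192 ALUs @ Wii U's 550 MHz", 550)]:
    print(f"{name}: {peak_gflops(192, mhz):.1f} GFLOPS")
```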
I think A6 APUs are 192.

Don't see how it gels with the die shot, though. You'd think there would have to be some 3-way self-similarity.
Beaten to the punch it seems.
 
Sounds like he's mixing up threads and shaders (192 threads).

Since we're at this point: the docs list Latte as having 32 ALUs, and it's a VLIW5 architecture.
Are these the secret docs that we can't see? I thought it sounded like these "docs" are not what you showed EatChildren either, as there were no hard numbers for ALUs therein? Or am I mixing that up with something else?

Can we really dismiss a seemingly concrete statement made by an actual developer heavily working with the hardware so easily?
 
Which docs, and when were they dated?
Compute shaders hadn't been enabled yet with the version I saw, but the 192 threads is a part of that.

Are these the secret docs that we can't see? I thought it sounded like these "docs" are not what you showed EatChildren either, as there were no hard numbers for ALUs therein? Or am I mixing that up with something else?

Can we really dismiss a seemingly concrete statement made by an actual developer heavily working with the heavily so easily?
I'm not dismissing it. The SDK says there are 192 threads (which I checked on after being informed about that post) and 32 ALUs. I gave EC a summarized version if you must know. The 32 ALUs and VLIW5 was a part of the summary.

32 x 5 = 160

I get the feeling you are looking at my posts with the same face as your avatar, haha.
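If that summary is right, the arithmetic is at least internally consistent: 32 VLIW5 units give 160 ALUs, which at the widely reported 550 MHz clock (my addition, not something stated in this post) lands on the familiar GFLOPS figure:

```python
# bg's summary: 32 VLIW5 units x 5 lanes each = 160 ALUs.
vliw_units = 32
lanes = 5                      # VLIW5
alus = vliw_units * lanes
print(alus)  # 160

# Peak throughput = ALUs x 2 FLOPs/cycle (multiply-add) x clock.
# 550 MHz is the widely reported Wii U GPU clock, assumed here.
print(alus * 2 * 550 / 1000)  # 176.0 GFLOPS
```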
 
Compute shaders hadn't been enabled yet with the version I saw, but the 192 threads is a part of that.



I'm not dismissing it. The SDK says there are 192 threads (which I checked on after being informed about that post) and 32 ALUs. I gave EC a summarized version if you must know. The 32 ALUs and VLIW5 was a part of the summary.

32 x 5 = 160

I get the feeling you are looking at my posts with the same face as your avatar, haha.
So, you're saying it is 160 shader units and 192 threads? Where did the 192 thread count come from? How does that work?
 
Compute shaders hadn't been enabled yet with the version I saw, but the 192 threads is a part of that.



I'm not dismissing it. The SDK says there are 192 threads (which I checked on after being informed about that post) and 32 ALUs. I gave EC a summarized version if you must know. The 32 ALUs and VLIW5 was a part of the summary.

32 x 5 = 160

I get the feeling you are looking at my posts with the same face as your avatar, haha.
Wouldn't threads have to scale linearly with the shader core count? I don't really see how one shader core can be handling more than one thread at a time, or alternatively why only a few of them would be handling more than one thread at a time.
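One possible reconciliation (pure speculation on my part, not from the SDK): GPU "thread" counts usually describe work kept in flight to hide memory latency, not lanes executing per cycle, so they don't have to match the ALU count. AMD GPUs of that era group threads into 64-wide wavefronts, and 192 happens to be exactly three of them:

```python
# Hypothetical reading: "192 threads" = threads in flight, not ALU lanes.
# VLIW-era Radeons schedule work in 64-wide wavefronts and keep several
# wavefronts resident per SIMD to hide memory latency.
WAVEFRONT_SIZE = 64

threads_in_flight = 192
print(threads_in_flight % WAVEFRONT_SIZE)   # 0 -> divides evenly
print(threads_in_flight // WAVEFRONT_SIZE)  # 3 wavefronts in flight

# The ALU lane count (160 per bg's numbers) is a separate quantity;
# it doesn't need to match the in-flight thread count.
```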
 