
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Status
Not open for further replies.
(baseless conjecture alert!)
I still think 320 ALUs makes the most sense, 160 with all that memory feels weird to me. I mean: why?

Hahahaha.

That's, literally, the BEST summation of this thread on the last few pages.

Please brace for those who still feel there's some Nintendo-exclusive techniques that will make the Wii U incredibly overpowered
There are other threads for hearsay and groupthink. The last few pages of this thread are an oasis of logic and reason in a sea of sh*t. Sh*t like this post I just quoted.

I don't want to read 6000+ posts so can someone please summarize what is known?
High level summary: Nintendo and AMD customized the hell out of this thing.
 

Donnie

Member
I think he means underpowered, similar to how the Wii was back in 2006.

The sheer power difference between Wii U and PS4/XBO will make ports really hard or downright impossible for the majority of devs. This isn't debatable.


Where's the connection between relative power and interest in chip design? Should we not be interested in the Xbox One's GPU or the PS4's GPU because they're so underpowered in comparison to GPUs that have been on the market for a while now in the PC space?

See, there's really no logic to this idea. What's interesting for people here is a GPU that's unexplained. We know loads about the other consoles' GPUs; what's the point in speculating about something you already know?

But of course, nevertheless, we still have the same people coming into this thread with their console wars attitudes. My advice to Heavy and the like is: if you aren't interested in the topic at hand then feel free to not be here, you aren't contributing anything worthwhile.
 

JordanN

Banned
You seem to know something everyone else doesn't. The rest of us don't know the "pure calculating power" of the Wii U.

Do you really think Wii U is closer to PS4/XBO than 360/PS3 in power? I think you'll be extremely disappointed if so.

We already know its thermal and watt limits.
 

krizzx

Junior Member
(baseless conjecture alert!)
I still think 320 ALUs makes the most sense, 160 with all that memory feels weird to me. I mean: why?


There are other threads for hearsay and groupthink. The last few pages of this thread are an oasis of logic and reason in a sea of sh*t. Sh*t like this post I just quoted.



High level summary: Nintendo and AMD customized the hell out of this thing.
Do you ever actually contribute to this thread?

I didn't think so.


Thank you. I thought I was alone in noticing this. It's like there is this small group who come in here for no other reason than to bash the console and impede progress. It's as if the thought of someone not hating the Wii U causes them physical harm. It's so disgusting.

Where's the connection between relative power and interest in chip design? Should we not be interested in the Xbox One's GPU or the PS4's GPU because they're so underpowered in comparison to GPUs that have been on the market for a while now in the PC space?

See, there's really no logic to this idea. What's interesting for people here is a GPU that's unexplained. We know loads about the other consoles' GPUs; what's the point in speculating about something you already know?

But of course, nevertheless, we still have the same people coming into this thread with their console wars attitudes. My advice to Heavy and the like is: if you aren't interested in the topic at hand then feel free to not be here, you aren't contributing anything worthwhile.

That is the key. They aren't using any. Their purpose for being here isn't progressive in the slightest.
 

Donnie

Member
Do you really think Wii U is closer to PS4/XBO than 360/PS3 in power? I think you'll be extremely disappointed if so.


Can't you have this console wars discussion elsewhere? This thread isn't for this kind of thing. Nobody cares that WiiU isn't as powerful as Xbox One or PS4. Nobody's interested in any ill conceived multiples you or anyone else has come up with. Especially ill conceived considering even Xbox One and PS4 aren't exactly close in power themselves. This thread is kind of a hobby for some, if you have a problem with that then I don't know what to tell you.
 
Can't you have this console wars discussion elsewhere? This thread isn't for this kind of thing. Nobody cares that WiiU isn't as powerful as Xbox One or PS4 or any ridiculously ill conceived multiples you or anyone else has come up with (especially ill conceived considering even Xbox One and PS4 aren't exactly close in power themselves).

praise the sweet billy jebus.
 

JordanN

Banned
Can't you have this console wars discussion elsewhere? This thread isn't for this kind of thing. Nobody cares that WiiU isn't as powerful as Xbox One or PS4 or any ridiculously ill conceived multiples you or anyone else has come up with (especially ill conceived considering even Xbox One and PS4 aren't exactly close in power themselves).

If you read this page, you would know I had to bring up PS4/XBO. It's only console wars if you make it out to be (I'm not doing that, however).

And the XBO/PS4 are close enough to follow with my downport example.
 
I'm not saying I know Wii U's exact power but I know what the more likely answer is.

If Wii U isn't closer to PS4/XBO, then knowing Wii U's exact power would be moot, for the exact reasons downporting would be hard.

All I'm saying is that there's the people who are actively exploring the mysteries of the Wii U GPU, and there's the people who seem to already have the answer. If there's a reason why you think you have the "more likely" answer, there's plenty of people more well versed in hardware than I am who would be willing to discuss it with you. That is, if your reasons are sound.
 

wsippel

Banned
You gotta ask Microsoft/Sony but I think it's pretty clear it's pure calculating power.
Neither system is ten times more powerful than its predecessor. And a lot of what those systems have over their predecessors is apparently wasted on useless OS features most users will never, ever touch.
 

Earendil

Member
Like a lot of others, I gave up.

Too many people are entirely hostile and passionate about this thing, and perhaps unrealistically so.

Every Wii U thread gets derailed by petty console wars BS on both sides and there's no point in even trying to have meaningful discussion. Most of us in this thread don't care how powerful or not powerful the system is, we just want to try and figure out what this chip does and how it works, not how it compares in power to unreleased consoles.

There's a million other threads around here to ridicule the Wii U's lack of power (which isn't even the point of this thread), so please find another one and have at it.
 

JordanN

Banned
Neither system is ten times more powerful than its predecessor. And a lot of what those systems have over their predecessors is apparently wasted on useless OS features most users will never, ever touch.
I think you should tell Microsoft that*.

*Actual figure was 8x although PS4 may be closer to 10.

Edit: Didn't we also have Lherre say the same thing?
 

prag16

Banned
From the outside looking in (I know nothing about this) I'm wondering why even bother when we know it's underpowered. The Wii U is just slightly more powerful than the 360 and PS3. Why waste any significant time at all trying to analyze its GPU? It's like writing a thesis for your PhD on a Transformers film, in the sense that you've written this incredible, detailed, well-researched paper for an incredibly crappy film.

Hahahaha.

That's, literally, the BEST summation of this thread on the last few pages.

Please brace for those who still feel there's some Nintendo-exclusive techniques that will make the Wii U incredibly overpowered

Be that as it may (pretty insulting though), it's a better use of time than baiting and attacking Nintendo fans in every applicable topic.
 

joesiv

Member
From the outside looking in (I know nothing about this) I'm wondering why even bother when we know it's underpowered. The Wii U is just slightly more powerful than the 360 and PS3. Why waste any significant time at all trying to analyze its GPU? It's like writing a thesis for your PhD on a Transformers film, in the sense that you've written this incredible, detailed, well-researched paper for an incredibly crappy film.
I think it's more like the Wii U is the girl next door: while she's no supermodel, she's unique and mysterious, and for some reason it's fun trying to figure her out.

Some chase after the supermodels, but in the end they are often the least interesting, with little to explore since they're famous and have been interviewed to death.

You go chase after your A list celebrity consoles, and leave us to our down and out one lol.
 

JordanN

Banned
Ignoring the fact that not a single spec is 10x that of the WiiU...yeah, looks like we still need this thread as we keep getting ignorant made up posts like this

I don't think 10x is literally "10x everything" but it all adds up.

The power to do 1080p, advanced physics processing, post processing, the power to process larger textures, a strong tessellation unit, etc., all at once.

If you can show me Wii U/PS3/360 doing that I'll surely retract such statements.
 

bomblord

Banned
I don't think 10x is literally "10x everything" but it all adds up.

The power to do 1080p, advanced physics processing, post processing, the power to process larger textures, a strong tessellation unit, etc., all at once.

From knowledge of basic algebra I can tell you that if it is 10x then it is literally 10x everything.

If that is not what 10x means then please tell me how you are measuring 10x.
 
People, just don't feed the trolls.

Posts that bring nothing to the conversation need no attention at all.

... Gets back to reading this thread
 

pestul

Member
I don't think 10x is literally "10x everything" but it all adds up.

The power to do 1080p, advanced physics processing, post processing, the power to process larger textures, a strong tessellation unit, etc., all at once.

If you can show me Wii U/PS3/360 doing that I'll surely retract such statements.

If XB1 and PS4 demand 1080p from the devs, Wii U would sit quite nicely with 720p downspec ports imo (if it were to get many). With regards to GPU feature set it is of course much closer to those systems than Wii ever was to 360/PS3. Well, Wii wasn't actually even close and it still managed to get ports.
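For what it's worth, the raw pixel math behind the 720p-port idea: 1080p is exactly 2.25x the pixels of 720p, so targeting 720p cuts the per-frame fill-rate and shading load by more than half. A quick check:

```python
# Pixels per frame at the two target resolutions
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_720p = 1280 * 720     #   921,600

print(pixels_1080p / pixels_720p)  # 2.25
```

That 2.25x factor alone absorbs a good chunk of any raw GPU gap for resolution-bound workloads.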
 

JordanN

Banned
From knowledge of basic algebra I can tell you that if it is 10x then it is literally 10x everything.

If that is not what 10x means then please tell me how you are measuring 10x.
Again, I can't answer this. You'll have to contact Microsoft/Sony for a definite answer of what 10x means (or someone with more knowledge fill me in).


If XB1 and PS4 demand 1080p from the devs, Wii U would sit quite nicely with 720p downspec ports imo (if it were to get many). With regards to GPU feature set it is of course much closer to those systems than Wii ever was to 360/PS3. Well, Wii wasn't actually even close and it still managed to get ports.
I don't think Wii's situation got any better when it was 480p. I'm not seeing it change much.
 

Schnozberry

Member
I don't think 10x is literally "10x everything" but it all adds up.

The power to do 1080p, advanced physics processing, post processing, the power to process larger textures, a strong tessellation unit, etc., all at once.

If you can show me Wii U/PS3/360 doing that I'll surely retract such statements.

Adds up to what? This thread isn't about proving the Wii U's superiority to other platforms. We're just trying to figure out exactly what to expect from the GPU, and the constant digressions into console wars malarkey are unhelpful.
 

prag16

Banned
Neither system is ten times more powerful than its predecessor. And a lot of what those systems have over their predecessors is apparently wasted on useless OS features most users will never, ever touch.
It's almost comical how some people can't or refuse to grasp the fact that this is nothing like a Wii vs. PS360 situation in terms of capabilities.

The raw power gap is not quite as large (easily more than an order of magnitude vs. easily less than an order of magnitude) and the feature set gap is closer still. Architecture and design philosophy are not as divergent either.

Wii U could still miss out on most ports due to business reasons, but that's not the point of this thread. This small handful of posters that get off on attacking anybody who doesn't talk down the Wii U as much as they do are the ones wasting their time, not the group investigating this mystery GPU.
 

z0m3le

Banned
If you read this page, you would know I had to bring up PS4/XBO. It's only console wars if you make it out to be (I'm not doing that, however).

And the XBO/PS4 are close enough to follow with my downport example.

XB1 and PS4 are pretty far apart. PS4 has a GPU with ~66% more processing power. It has twice the usable RAM, and while they have the same CPU, PS4's can have a lot of freed-up processing cycles thanks to the extra grunt of its GPU for GPGPU calculations. Even if you just looked at the flops from the GPUs to get a "clear" picture of next gen, you have XB1 coming in at either ~3 or ~6 times Wii U's GPU (considering transistor count and wattage, leaning towards 3x), and PS4 at ~5 or ~10 times Wii U's GPU. This is closer to your generational leap, I guess, but because multiplatform titles will have to deal with XB1, they can't do anything that greatly outstrips the Wii U.

No one should expect a long life for Wii U; it will likely be replaced in 2017 or 2018. But 360 and PS3 will likely receive console ports until sometime in 2016, and that really muddies the waters, especially when you consider how mobile is catching up. It's all a bit stupid and clumsy. The gen we are heading into has shown that every platform launched so far has stumbled, and Wii U has yet to pick up too. I'm just rambling now, so I'll end with a thought to tie this all together: Wii U is as close to next gen as third parties want it to be; hardware has very little to do with it. It probably would have taken 3 or 4 TFLOPs to bury Wii U from a performance standpoint, at least to make games unportable to Wii U with the same code. At the moment, however, it is certainly unprofitable for third parties to spend many resources bringing any multiplatform title to the system unless they are called Activision or Ubisoft, who've been able to do all their AAA multiplats at under $2 million USD, which makes selling ~60k units profitable.
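For reference, those multiples follow from the usual peak-throughput formula: GFLOPS = ALUs × 2 ops per cycle (one multiply-add) × clock in GHz. A quick sanity check; note that the Latte ALU counts and all the clocks below are the rumored figures being debated in this thread, not confirmed specs:

```python
def peak_gflops(alus, clock_ghz):
    # Peak single-precision throughput: each ALU retires one
    # fused multiply-add (2 floating-point ops) per cycle.
    return alus * 2 * clock_ghz

# Rumored figures, not confirmed specs.
latte_low = peak_gflops(160, 0.55)    # 176 GFLOPS if Latte has 160 ALUs @ 550 MHz
latte_high = peak_gflops(320, 0.55)   # 352 GFLOPS if it has 320
xb1 = peak_gflops(768, 0.8)           # ~1229 GFLOPS
ps4 = peak_gflops(1152, 0.8)          # ~1843 GFLOPS

print(round(xb1 / latte_high, 1), round(xb1 / latte_low, 1))  # 3.5 7.0
print(round(ps4 / latte_high, 1), round(ps4 / latte_low, 1))  # 5.2 10.5
```

So the "~3 or ~6" and "~5 or ~10" ranges quoted above hinge entirely on whether Latte has 320 or 160 ALUs.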
 

JordanN

Banned
Adds up to what? This thread isn't about proving the Wii U's superiority to other platforms. We're just trying to figure out exactly what to expect from the GPU, and the constant digressions into console wars malarkey are unhelpful.
Ironically, it's posts like these that I think "damaged" this thread more than me or some other users did. I think it would be nice to actually follow the conversation before pointing fingers or hastily joining in.
 

bomblord

Banned
It's almost comical how some people can't or refuse to grasp the fact that this is nothing like a Wii vs. PS360 situation in terms of capabilities.

The raw power gap is not quite as large (easily more than an order of magnitude vs. easily less than an order of magnitude) and the feature set gap is closer still. Architecture and design philosophy are not as divergent either.

Wii U could still miss out on most ports due to business reasons, but that's not the point of this thread. This small handful of posters that get off on attacking anybody who doesn't talk down the Wii U as much as they do are the ones wasting their time, not the group investigating this mystery GPU.

There should be no gap in features, only in raw computational power. Everything we've heard from devs points to DX10.1/11-equivalent features.
 

Schnozberry

Member
Ironically, it's posts like these that I think "damaged" this thread more than me or some other users did. I think it would be nice to actually follow the conversation before pointing fingers or hastily joining in.

I guess I must have been imagining all the gaps between relevant posts filled with hardware spec dick swinging.
 

Donnie

Member
If you read this page, you would know I had to bring up PS4/XBO. It's only console wars if you make it out to be (I'm not doing that, however).

And the XBO/PS4 are close enough to follow with my downport example.


I've been reading; I don't think there was any need to continue the nonsense that began this little detour from the topic at hand, let alone bringing up the whole 10x idea, which is no more a fact than the sky being purple.

Now you're trying to justify this "fact" by talking about some arbitrary level of performance that apparently denotes a factor of 10 (1080p, processing large textures, etc.). I don't mean this as an insult, but you don't know what you're talking about. If this was another thread I'd happily explain why it makes no sense at all on numerous levels, but that's not what this thread's for. Can we now get back to the discussion at hand please?
 
Yep, that's correct.

I may as well throw this out there rather than stew on it myself...

I wonder if Block P isn't UVD with blocks Q being whatever it is joesiv labeled blocks "H" and "G" on Llano.

Perhaps this might even make sense of the block labeled "TAVC" on Brazos. It's right next to the UVD and could be a consolidated block like many of the others. Might be an "Accelerated Video Codec" or something like that and basically be 2x Latte's Q blocks.

Where that puts shader export, I don't know. Maybe Block "I" now that I'm going with bgassassin in having Block D as the UTDP...

Edit: Damn, it's like the Shin'en thread spilled into here. :\
 

prag16

Banned
Activision or Ubisoft, who've been able to do all their AAA multiplats at under $2 million USD, which makes selling ~60k units profitable.
Do you have a source for that? I think Ubisoft made a comment at one point to the effect of a €1 million cost for porting. It would be very interesting to know this. If the porting costs were under $2 million, that means flops like FIFA may even have been (barely) profitable, and games like CoD, AC3, and Batman, while not bringing forth windfalls, were solidly in the black. Unless Ghosts, AC4, and Arkham Origins sell worse than their predecessors, I don't see these companies dropping all support. If they do, it makes it that much harder to take advantage if the system does end up with a solid installed base down the line.
 

JordanN

Banned
I've been reading; I don't think there was any need to continue the nonsense that began this little detour from the topic at hand, let alone bringing up the whole 10x idea, which is no more a fact than the sky being purple.


Now you're trying to justify this "fact" by talking about some arbitrary level of performance that apparently denotes a factor of 10 (1080p, processing large textures, etc.). I'm not saying this to insult you, but you don't know what you're talking about.
Hey, I'm just saying PS4/XBO represents big leaps over their predecessor and the 10x alludes to that. I never said I knew for sure what 10x meant and you're within your power to take it up with MS/Sony personally. Don't shoot the messenger I guess.

I brought it up because I wanted to defend myself. If no one had replied, the conversation would have already been over.
 

ozfunghi

Member
Hey, I'm just saying PS4/XBO represents big leaps over their predecessor and the 10x alludes to that. I never said I knew for sure what 10x meant and you're within your power to take it up with MS/Sony personally. Don't shoot the messenger I guess.

I brought it up because I wanted to defend myself. If no one had replied, the conversation would have already been over.

Ah. A new approach. "Just don't respond to my bullshit and i'll go away!"
 

pestul

Member
I don't think Wii's situation got any better when it was 480p. I'm not seeing it change much.
With it not being even close to the same situation?

I see it's not really worth getting into this in a thread specifically about Wii U's gpu.

EDIT: Apparently you have to get the last word in.
 

wsippel

Banned
I may as well throw this out there rather than stew on it myself...

I wonder if Block P isn't UVD with blocks Q being whatever it is joesiv labeled blocks "H" and "G" on Llano.

Perhaps this might even make sense of the block labeled "TAVC" on Brazos. It's right next to the UVD and could be a consolidated block like many of the others. Might be an "Accelerated Video Codec" or something like that and basically be 2x Latte's Q blocks.

Where that puts shader export, I don't know. Maybe Block "I" now that I'm going with bgassassin in having Block D as the UTDP...

Edit: Damn, it's like the Shin'en thread spilled into here. :\
Hard to tell. What's your take on block D by the way? I'd say D looks like some sort of (de)compressor - large SRAM pool for dictionaries, some dual port SRAM as work memory. Because (de)compressors apparently have both of those. But D doesn't look like any block in any known AMD GPU design as far as I can tell.
 

z0m3le

Banned
Do you have a source for that? I think Ubisoft made a comment at one point to the effect of a €1 million cost for porting. It would be very interesting to know this. If the porting costs were under $2 million, that means flops like FIFA may have even been (barely) profitable, and games like CoD, AC3, and Batman, while not bringing forth windfalls, were solidly in the black. Unless ghosts, AC4, and Arkham origins sell worse than their predecessors, I don't see these companies dropping all support. If they do, it makes it that much harder to take advantage if the system does end up with a solid installed base down the line.

http://www.gamesindustry.biz/articl...says-wii-u-ports-costing-under-USD1-3-million ~$1.2 million USD for Ubisoft, which works out to ~40k copies at $30 profit per game for Ubisoft.
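The break-even figure is just port cost divided by per-copy margin; a quick check (the $30-per-copy publisher margin is an assumption from the post above, not something stated in the linked article):

```python
port_cost = 1_200_000     # ~$1.2M USD Wii U port budget (Ubisoft, per the linked article)
margin_per_copy = 30      # assumed publisher profit per copy sold

break_even_copies = port_cost / margin_per_copy
print(break_even_copies)  # 40000.0
```

At a $2M budget the same math gives ~67k copies, which is roughly where the "~60k units" figure earlier in the thread comes from.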
 
Hard to tell. What's your take on block D by the way? I'd say D looks like some sort of (de)compressor - large SRAM pool for dictionaries, some dual port SRAM as work memory. Because (de)compressors apparently have both of those. But D doesn't look like any block in any known AMD GPU design as far as I can tell.

I think that it is probably the UTDP at this point. I've always seen the two distinct SRAM pools, along with its general vicinity next to the CPU interface and what I see as hardware interpolators, as being indicative of an instruction cache and constant cache. I originally thought the UTDP was an adjacent block and D was just the caches. Now, after going back and forth w/ bg and seeing Brazos, I think it's all-inclusive.

What kind of decompressor were you thinking exactly? Video signal?

I think I might be onto something with the UVD thing down in P though. Still racking my brain on what TAVC could stand for in Brazos. "Target Video Codec?"
 

JordanN

Banned
This caught my attention (and I guess it's GPU related).

and it probably would have taken 3 or 4 TFLOPs to bury Wii U from a performance standpoint
When I'm hearing zero mention of UE4 or Agni's running on Wii U (the tech involved, not the engine) I'm not sure why anyone would believe this.

I think if Wii U really was this "PS2 of next gen" that I'm sure many here would like to believe in, why oh why is it that years after the console was announced, all this next-gen stuff is showing up on its competitors and not on Wii U?

I think 1 or 1.7 TFLOPs is enough to bury Wii U, since it's quite evident devs don't care enough to use Wii U for all these fantastic demos or games.

z0m3le said:
nothing you've just said has to do with tech
And whoosh. Oh well, I guess I'm done now.
 
This caught my attention (and I guess it's GPU related).


When I'm hearing zero mention of UE4 or Agni's running on Wii U (the tech involved, not the engine) I'm not sure why anyone would believe this.

I think if Wii U really was this "PS2 of next gen" that I'm sure many here would like to believe in, why oh why is it that years after the console was announced, all this next-gen stuff is showing up on its competitors and not on Wii U?

I think 1 or 1.7 TFLOPs is enough to bury Wii U, since it's quite evident devs don't care enough to use Wii U for all these fantastic demos or games.

Are you still here? Read the title.
 

z0m3le

Banned
JordanN, nothing you've just said has to do with tech. Even the PS4 has trouble doing the UE4 demo, and won't have the next-gen lighting under UE4 as well...
 

prag16

Banned
This caught my attention (and I guess it's GPU related).


When I'm hearing zero mention of UE4 or Agni's running on Wii U (the tech involved, not the engine) I'm not sure why anyone would believe this.

I think if Wii U really was this "PS2 of next gen" that I'm sure many here would like to believe in, why oh why is it that years after the console was announced, all this next-gen stuff is showing up on its competitors and not on Wii U?

I think 1 or 1.7 TFLOPs is enough to bury Wii U, since it's quite evident devs don't care enough to use Wii U for all these fantastic demos or games.
Please. The Wii U could have been packing a teraflop and it would still miss out on the vast majority of the games it's currently missing out on. And the multiplats would probably still be half-assed right now.
 
Does anyone have better versions of the RV770, or a Brazos die shot? The RV770 shot in the OP is pretty small, probably wouldn't be worth tweaking, and I didn't see a Brazos shot. Bigger the better, uncompressed ideal!
Nice work. These are the RV770 and Brazos pics I've been relying on. That was the best RV770 I found.

RV770

Brazos

I've been using Photoshop for 18 years, so I could certainly help with any needed "art".

He finally came around. :)

Alright, here's what I see as going on around the perimeter of RV770. It's a bit hard to tell where some blocks end and others begin, but this should more or less give some idea of where I'm coming from. I've labeled what must be the L2 cache blocks. The other blocks I've outlined are either memory controllers or ROPs. No way for me to say which is which.

http://u.cubeupload.com/Fourth_Storm/rv770annotated.jpg

Hmm, actually, if I were to take a guess, I would say that the highlighted block all the way to the top right (with its small portion of brighter SRAM) would be a memory controller. Descriptions of the layout have memory controllers paired w/ the L2 and ROPs. Obviously the actual die is a bit messier, but if we are to go on that, then the block to the left of that separated block of L2 on top should be a memory controller. I don't think the ROPs and L2s interact with each other, so there would be no need for them to be in close proximity.

Edit: I've revised the annotation to reflect these thoughts.

Thank you good sir. I was hoping for something like this. I'll try to see what you are seeing.

Q is the display controller and analogous to Latte's F block. Both appear near the display phy and have 32 of those small SRAM banks.

I kept questioning my previous labeling of this block in Llano and this makes more sense to me.

I may as well throw this out there rather than stew on it myself...

I wonder if Block P isn't UVD with blocks Q being whatever it is joesiv labeled blocks "H" and "G" on Llano.

Perhaps this might even make sense of the block labeled "TAVC" on Brazos. It's right next to the UVD and could be a consolidated block like many of the others. Might be an "Accelerated Video Codec" or something like that and basically be 2x Latte's Q blocks.

Where that puts shader export, I don't know. Maybe Block "I" now that I'm going with bgassassin in having Block D as the UTDP...

Edit: Damn, it's like the Shin'en thread spilled into here. :\

I believe your original placement of the SE being P is correct. It looks too similar to the one in Brazos for me to change right now.

Even though there is a fab size difference, the Q blocks are much, much larger and have more memory than the two Llano blocks you are comparing them to.

To be more specific on my previous view, I think F is the display controller and E is the UVD.

Hard to tell. What's your take on block D by the way? I'd say D looks like some sort of (de)compressor - large SRAM pool for dictionaries, some dual port SRAM as work memory. Because (de)compressors apparently have both of those. But D doesn't look like any block in any known AMD GPU design as far as I can tell.

Fourth has already mentioned this, but I believe D is the Ultra-Threaded Dispatch Processor. It looks too much like the one in Brazos. But like Fourth, I'd like to hear more.
 