
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


krizzx

Junior Member
You left out a key point made.

Proper tools.

As a near-launch open-world title, even though they only had a small window in which the current tools were available, you expect the game to have no frame-rate issues or pop-in? I haven't played the game myself, but I have heard people say that the last parts of the game seem far more polished and have fewer issues than the earlier portions.

The performance of that game and many others also increased after the big spring update.

I've played the game from the beginning since then and I have not encountered a single instance of slowdown. So far, I'm at chapter 9.
 
That demo is all running on the PC. No word if all the tessellation and real time lighting will make it to the Wii U. The PS3/360 would not be able to run the game at those detail settings.

Precursor Games devs said that there will be no substantial difference between PC and Wii U versions. And the Wii U is capable of both tessellation and real time lighting so...
 
You left out a key point made.

Proper tools.

As a near-launch open-world title, even though they only had a small window in which the current tools were available, you expect the game to have no frame-rate issues or pop-in? I haven't played the game myself, but I have heard people say that the last parts of the game seem far more polished and have fewer issues than the earlier portions.

It's a lot better looking than any other Lego game, and none of the recent ones boast flawless framerates or snappy load times. It's not a super-advanced engine, but I think Undercover, huge open world and all, sees it looking its best.

For what it's worth, it was also only TT Fusion's third HD game, the others being the PS3 port of LotR: Aragorn's Quest and Lego Rock Band. They're a handheld developer who delivered what I believe is, warts and all, the most impressive Lego game to date.
 
You left out a key point made.

Proper tools.

As a near-launch open-world title, even though they only had a small window in which the current tools were available, you expect the game to have no frame-rate issues or pop-in? I haven't played the game myself, but I have heard people say that the last parts of the game seem far more polished and have fewer issues than the earlier portions.

And it had been in development since 2010:
http://www.youtube.com/watch?v=lOn1FX8kNd4 00:38
 

prag16

Banned
The performance of that game and many others also increased after the big spring update.

I've played the game from the beginning since then and I have not encountered a single instance of slowdown. So far, I'm at chapter 9.
The other thing to keep in mind is that the engine for these LEGO games has been on PS360 for years and obviously not Wii U. So in terms of this being a "ground up" Wii U game... well not exactly.
 

Hermii

Member
I guess we will get a step closer to finding out how it compares to the 360 after the next Nintendo Direct in a couple of weeks.
 

krizzx

Junior Member
The other thing to keep in mind is that the engine for these LEGO games has been on PS360 for years and obviously not Wii U. So in terms of this being a "ground up" Wii U game... well not exactly.

Indeed. The same thing happened with the Wii because devs kept making games with PS2 engines.

The games were completely devoid of bloom, dynamic lighting, volumetric lighting, bump mapping, normal mapping, high level physics, and all of the other things the Wii was capable of that the PS2 was not. People instantly jumped to the conclusion that it was the result of the Wii's inability and not the devs' lack of effort or unfamiliarity with the hardware, just as they are doing now.

http://2.bp.blogspot.com/-gd9mAj2y14M/T9soicVnTxI/AAAAAAAABNc/2rRBuceTpZ4/s1600/Red+Steel.jpg
http://image.gamespotcdn.net/gamespot/images/2010/332/991817_20101129_screen004.jpg
http://images1.wikia.nocookie.net/_...arkside-chronicles-20090428064148735_640w.jpg
http://sd-engineering.nl/nmcube/quake04.jpg
http://nsider2.com/wp-content/uploads/2010/08/Jett-Rocket-jet-pack.jpg

You could show them a thousand examples like those, and they would still insist the hardware was only on par with the PS2...

That's why I've stopped trying to convince people that the Wii U hardware is stronger. If they don't want it to be, then they will not acknowledge that it is. They will always claim that the lowest end of what it has shown is its maximum capability.

I guess we will get a step closer to finding out how it compares to the 360 after the next Nintendo Direct in a couple of weeks.

The next Nintendo Direct will tell us nothing of the sort. All it will show us is what has been done so far, not what can be done. It takes years of development to get the full potential out of a console. The Wii U hasn't even been out a year and people are expecting Uncharted 3 and Halo 4 level games on it immediately.

The stronger a console is, the longer it takes to develop a proper game for it without the use of premade game engines, which the Wii U is severely lacking for the moment.
 
The games were completely devoid of bloom, dynamic lighting, volumetric lighting, bump mapping, normal mapping, high level physics, and all of the other things the Wii was capable of that the PS2 was not.

I hope the rest of you read stuff like this and chuckle, then sigh, which is what those of us with first hand experience shipping titles on these platforms do when we read stuff like this.
 

jaz013

Banned
The games were completely devoid of bloom, dynamic lighting, volumetric lighting, bump mapping, normal mapping, high level physics, and all of the other things the Wii was capable of that the PS2 was not. People instantly jumped to the conclusion that it was the result of the Wii's inability and not the devs' lack of effort or unfamiliarity with the hardware, just as they are doing now.

Having seen an area full of these effects using Unity3D, running directly on the Wii hardware, I just keep scratching my head, because I couldn't understand why so many games were so bare-bones when it was actually very straightforward to implement most of these.

One example on the Wii U I can think of is Injustice; it looks like the developers took the least effort and were just happy it didn't crash. The gameplay is replicated on the GamePad and there isn't even an option to customize the pad display. And apparently, Capcom took about a week of development to properly port their MT Framework engine (the "stuttering" reported by a few looks, shockingly, like the result of poor coding, much like an engine I once worked on in Unity3D as a school project).
 

krizzx

Junior Member
What game is this?

The Conduit 2. It was a shame that High Voltage Software didn't have as much creative talent as they had programming skills. They made a great game engine but they weren't very artistic, their story writing skills were subpar and their artistic design needed a little work.

I often find myself looking at this and wondering what could have been. http://www.youtube.com/watch?v=4QFkevmDfyk They should have tried to license the game engine to other devs.

It's good that the Wii U has CryEngine 3 at least. I honestly think it's the best of the next-gen game engines, even better than UE4. I wonder if Crytek will answer questions. I want to ask them if they implemented tessellation into their Wii U version of CryEngine 3.
 
Can the texture-compression feature that the Toki Tori 2 dev mentioned help reduce the RAM disparity relative to the PS4/XBone, or is that a common feature in GPUs?
Just curious!

The GamePad receives a compressed image stream from the Wii U through special compression hardware; perhaps it could be used for game textures too?
 

M3d10n

Member
Texture compression is a common feature for GPUs, but going by the WiiWare titles released last gen Nintendo have some crazy compression algorithms going on to squeeze a game like Jett Rocket into just 40MB. So yes, imo, Nintendo's texture compression should help in reducing the difference somewhat. How much though I wouldn't like to hazard a guess.

One other thing about the RAM that could be interesting is what Ancel said a while back about the Wii U having 'almost unlimited' RAM. I'm not sure about anyone else but to me that sounded like he was talking about something other than the difference between having 512MB and 1GB to play with.

Shin'en used procedural textures for their Wii games. Instead of storing per-pixel images, their textures are created from combinations of vector data, noise, smaller re-used image patterns and color/filter operations at load time.

You can see a very extreme example of procedural texture (and models, particles, sounds and nearly everything) in this 96KB demo: .kkrieger
(since it's 9 years old already, you can watch it on youtube if it doesn't run on your computer).
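The approach described above (building textures from patterns, noise and filter ops at load time instead of storing pixels) can be sketched in a few lines. This is a toy illustration; every function name and constant here is made up for the example, and nothing comes from Shin'en's or .kkrieger's actual tools:

```python
# A toy sketch of load-time procedural texture generation in the spirit
# of the Shin'en / .kkrieger approach described above. All names and
# constants are illustrative, not taken from any real engine.
import random

def value_noise(x, y, seed=0):
    """Cheap deterministic pseudo-noise in [0, 1) for a lattice point."""
    random.seed((x * 73856093) ^ (y * 19349663) ^ seed)
    return random.random()

def checker(x, y, size=8):
    """A tiny reusable base pattern (checkerboard)."""
    return 1.0 if ((x // size) + (y // size)) % 2 == 0 else 0.0

def generate_texture(width, height):
    """Blend a base pattern, noise and a color ramp at load time.
    What gets 'stored' is just this code (a few hundred bytes) rather
    than width * height * 4 bytes of pixel data."""
    texels = []
    for y in range(height):
        row = []
        for x in range(width):
            v = 0.7 * checker(x, y) + 0.3 * value_noise(x, y)       # filter/blend op
            row.append((int(200 * v), int(160 * v), int(120 * v)))  # color ramp
        texels.append(row)
    return texels

tex = generate_texture(64, 64)
print(len(tex), len(tex[0]))  # 64 64
```

The storage win is the whole point: the generator code stays constant in size no matter how large the output texture is.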

The GamePad receives a compressed image stream from the Wii U through special compression hardware; perhaps it could be used for game textures too?

It's not "special compression hardware" at all. It's a vanilla h.264 video stream sent via 5GHz Wifi (they reverse engineered the thing already).
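To put the "common feature for GPUs" point above in numbers: S3TC/DXT1 block compression, standard on GPUs of this era, packs each 4x4 texel block into 8 bytes. These are ordinary public format figures, nothing specific to Latte or Nintendo:

```python
# Back-of-the-envelope numbers for standard GPU block compression.
# S3TC/DXT1 stores each 4x4 texel block in 8 bytes; these are public
# format figures, not anything Latte-specific.
def raw_rgba_size(width, height):
    return width * height * 4                 # uncompressed 8-bit RGBA

def dxt1_size(width, height):
    return (width // 4) * (height // 4) * 8   # 8 bytes per 4x4 block

w, h = 1024, 1024
print(raw_rgba_size(w, h))                     # 4194304 (4 MiB)
print(dxt1_size(w, h))                         # 524288  (512 KiB)
print(raw_rgba_size(w, h) // dxt1_size(w, h))  # 8  (8:1 vs 32-bit RGBA)
```

So the commonly quoted 4-6:1 figure is roughly what any GPU of the time got for free, which is the baseline any "crazy" Nintendo scheme would have to beat.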
 

krizzx

Junior Member
Shin'en used procedural textures for their Wii games. Instead of storing per-pixel images, their textures are created from combinations of vector data, noise, smaller re-used image patterns and color/filter operations at load time.

You can see a very extreme example of procedural texture (and models, particles, sounds and nearly everything) in this 96KB demo: .kkrieger
(since it's 9 years old already, you can watch it on youtube if it doesn't run on your computer).

http://jettrocket.wordpress.com/

I don't recall them using that in Jett Rocket. They detailed the entire development process for the game.

The entire area was there at load time. You can see everything on the stage from one side to the other right from the beginning. There is nothing procedural about that. The same goes for Art of Balance and Fun Fun Minigolf. I could see them doing that for Fast Racing League, but I don't think they did it there either.
 

fred

Member
The Conduit 2. It was a shame that High Voltage Software didn't have as much creative talent as they had programming skills.

I often find myself looking at this and wondering what could have been. http://www.youtube.com/watch?v=4QFkevmDfyk They should have tried to license the game engine to other devs.

It's good that the Wii U has CryEngine 3 at least. I honestly think it's the best of the next-gen game engines, even better than UE4.

Yup, I think we're going to see Epic and Unreal having less of a stranglehold on the industry this gen, I know a couple of people that have had problems with support from Epic when working on UE3. They're supposed to be a right pain in the arse to work with...although that may just be those two people that don't have a lot of good to say about them of course lol.

You'll see CryEngine and especially Unity gaining more traction this gen I think.

I've always wondered why they didn't licence Quantum3 myself...maybe the tools weren't up to snuff, perhaps..? Certainly would have earnt them a few quid. :eek:/
 

A More Normal Bird

Unconfirmed Member
No one would have read into it much if the developer who spoke of it wasn't excited about it, and intentionally bringing it up to make a point. So there's every reason to.
There's every reason to what? Assume that because Ancel's team sometimes forgot to compress their textures and the game didn't subsequently crash that Nintendo has some kind of revolutionary tech? Of course, it wouldn't be 100:1 texture compression tech because they were using uncompressed textures. How about just assuming that the extra RAM in the Wii-U allowed it to handle the larger size of the uncompressed textures, like I said in my post. Maybe it's even enough for the entire game to utilise them and they're only compressing them for digital distribution reasons, who knows?

Look, it doesn't take a lot to handle a few uncompressed textures. A number of years ago, with no real clue of what I was doing, I made a mod for Oblivion using uncompressed 2k textures. The PC I had at the time had 1GB of DDR RAM and 256MB of GDDR3 on the graphics card. The game hitched for a handful of frames whenever it first loaded the textures (equipping the items) but other than that performance was unaffected.
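The memory math behind that anecdote is straightforward, using the numbers from the post (uncompressed 2048x2048 8-bit RGBA textures, a 256 MB graphics card):

```python
# Memory math for the Oblivion anecdote above, using the sizes given
# in the post: uncompressed 2K RGBA textures on a 256 MB card.
bytes_per_texture = 2048 * 2048 * 4        # one uncompressed 2K texture
vram_bytes = 256 * 1024 * 1024             # the 256 MB GDDR3 card mentioned

print(bytes_per_texture // (1024 * 1024))  # 16 -> 16 MiB per texture
print(vram_bytes // bytes_per_texture)     # 16 -> just 16 such textures fill the card
```

Which squares with the observed behavior: a handful of uncompressed textures fit comfortably, with only a hitch on first load.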

Precursor Games devs said that there will be no substantial difference between PC and Wii U versions. And the Wii U is capable of both tessellation and real time lighting so...

Just a quibble, but real time lighting and/or GI isn't something that you just have hardware support for like tessellation, it's basically a matter of grunt. We haven't seen a Wii-U title with real time GI like Cryengine offers, so it will be interesting to see if it's present in the first game on the console to use the engine. Considering the scope of the environments shown so far (and in the first ED) it could be more likely.
 
Yup, I think we're going to see Epic and Unreal having less of a stranglehold on the industry this gen, I know a couple of people that have had problems with support from Epic when working on UE3. They're supposed to be a right pain in the arse to work with...although that may just be those two people that don't have a lot of good to say about them of course lol.

You'll see CryEngine and especially Unity gaining more traction this gen I think.

I've always wondered why they didn't licence Quantum3 myself...maybe the tools weren't up to snuff, perhaps..? Certainly would have earnt them a few quid. :eek:/

I think they had intended to profit majorly from the engine themselves first, but unfortunately it didn't work out that way. Conduit was a bit overhyped, and it got slammed by no-name reviewers and hate sites, even though it was a solid-7 type of game.

Then they went for broke with Conduit 2, fixing mistakes, and putting the game people really wanted on hold (The Grinder). But by the time Conduit 2 released, even Nintendo was abandoning the Wii.

I REALLY hope they tweak their engine well, updating it for Wii U, release the Grinder (The fps, not that weird top down 3rd person thing they were messing with), then allow others to license their engine.

They've been focusing on mobile and tablet games for a while now, and games for licensed properties. That's how they built up the company strength to take the Conduit risk in the first place, so I hope it means they'll do it again, rather than stay back in obscurity. I Really like those guys.
 

Chronos24

Member
As unlikely as it is, if Nintendo/ NERD found a way to make wavelets work for texture compression, we'd be looking at ~100:1 instead of 4-6:1. And it might actually work with a Wii U like memory architecture: Read a wavelet compressed texture from MEM2, decompress/ transcode to S3TC, write to MEM1, texture from there.

If I remember correctly, in an interview Iwata said something about how memory-centric the console is, so there really might be some truth to all of this.
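The speculative pipeline quoted above (wavelet data in MEM2, decompressed on the fly, transcoded to S3TC in MEM1) can be illustrated with the simplest wavelet, the Haar transform. The transform below is real, standard math; the memory pools and the S3TC encode step are placeholders, and nothing here reflects actual Wii U hardware:

```python
# Illustration of the speculative MEM2 -> decompress -> S3TC -> MEM1
# pipeline described above, using a 1D Haar wavelet transform. The
# transform is standard math; everything hardware-related is a stand-in.

def forward_haar_1d(samples):
    """Split 2n samples into n averages and n detail coefficients."""
    averages = [(samples[i] + samples[i + 1]) / 2 for i in range(0, len(samples), 2)]
    details  = [(samples[i] - samples[i + 1]) / 2 for i in range(0, len(samples), 2)]
    return averages, details

def inverse_haar_1d(averages, details):
    """Reconstruct the 2n original samples exactly."""
    out = []
    for a, d in zip(averages, details):
        out.extend([a + d, a - d])
    return out

row = [10, 12, 8, 6, 7, 7, 3, 1]       # one row of texel values
avg, det = forward_haar_1d(row)        # what "MEM2" would hold (plus quantization)
restored = inverse_haar_1d(avg, det)   # on-the-fly decompress step
assert restored == row                 # lossless without quantization
# A real codec would quantize/entropy-code the details (that's where the
# high ratios come from); the transcode-to-S3TC and write-to-"MEM1"
# steps would follow here.
```

The high compression ratios of wavelet codecs come from quantizing those detail coefficients, which is also why real-time transcode is the hard part.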
 

fred

Member
I don't think they'll use the Quantum3 engine for consoles with traditional programmable shaders; they have an unprecedented lifetime licence for the Infernal engine, so they'll probably use that instead. I'd love to see The Grinder; the last thing I saw them working on was Conduit for the 3DS.

I really liked The Conduit, particularly the online multiplayer until hackers ruined it. The Bounty Hunter game was excellent, I'm surprised nobody has ripped it off yet tbh. A very good original idea.
 

Donnie

Member
That Lego City game is a Wii U exclusive published by Nintendo, designed from the ground up for the Wii U; it's not a port. But from what I read in reviews it suffers from slowdown issues and serious pop-up. It's not a technically demanding game by any standards; it shouldn't be taxing modern hardware at all.

There's exclusive games on 360 and PS3 that don't look technically demanding and yet suffer from slow down or graphical issues, same as any other system.

The idea that any hardware should handle a certain level of visuals with ease regardless doesn't really ring true.
 

wsippel

Banned
If I remember correctly, in an interview Iwata said something about how memory-centric the console is, so there really might be some truth to all of this.
I don't think it's feasible. They'd probably need some actual magicians to pull something like this off - on the other hand, that's apparently what NERD is all about: Making the impossible happen. Then again, not even Nvidia and AMD could make it work, and I'm sure they've been working on it for many years, because it would be a huge deal. Who knows.
 

fred

Member
I think a fair few people are going to be surprised by the eye candy that the 3D Mario and Retro's new game are going to display. Mind you, some regular posters in this thread are still going to deny that both games are beyond anything the PS3 and 360 are capable of, and will insist that the Wii U is only 'on par' with the previous-gen consoles.
 

Donnie

Member
If Wii Us MCM is 1.2B trans, that's one of the worst engineering jobs ever considering it has yet to prove it outpowers a 360 which weighed in at ~500m (same for PS3).

I'll take your word for it, after all you're a massive name in 3D hardware tech, while obviously the likes of Nintendo, IBM and AMD are mere amateurs.
 

Kai Dracon

Writing a dinosaur space opera symphony
As far as Lego City goes, I will say Traveler's Tales is not known for their technical ability. Their Lego engine never seemed to be very efficient, and the PC ports of some of the series could require a perversely beefy PC to run smoothly. (I've read Lego Batman 2 was one such culprit.)

They did not seem to optimize their engine much if at all for the U, especially considering the terrible load times it suffers from.
 

tipoo

Banned
If I remember correctly, in an interview Iwata said something about how memory-centric the console is, so there really might be some truth to all of this.

This is why it's infuriating how little Nintendo details these things: that comment could mean the cache setup, which we already know about (with some questions remaining, like the use of Wii parts in Wii U mode), or it could mean something new and crazy that they developed on their own (which, I mean, you'd think Nvidia and AMD would get to first).

I would think, though, that if not from Nintendo, then from some developer leaks or comments, we would have heard something about a game-changing technology. We did hear that stuff can be compressed, but stuff can be compressed on any modern GPU architecture; we don't know if this is something beyond that.
 
I'd love to see The Grinder; the last thing I saw them working on was Conduit for the 3DS.

Last I heard The Grinder is dead and buried. I wouldn't expect the game we all saw in that initial trailer to ever see the light of day. It was eventually turned into a top down view kind of game, and then killed off.
 
Actually, they had the top-down game they were going to release for PS360, and the original FPS version was still going to release for Wii. They never followed through, though I would imagine that they still have all of those assets.
 
That Lego City game is a Wii U exclusive published by Nintendo, designed from the ground up for the Wii U; it's not a port. But from what I read in reviews it suffers from slowdown issues and serious pop-up. It's not a technically demanding game by any standards; it shouldn't be taxing modern hardware at all.

The Lego City video game was first mentioned in 2009, so it was most likely not exclusive from day one. (Source: http://lego.wikia.com/wiki/LEGO_City_Undercover )

Also, the engine is the same as Lego Batman 2's. (Yes, I played that game as well, hence why there's a Lego Batman 2 on Wii U now...) The engine is just poor for open worlds. In order to take advantage of the Wii U (or other consoles) they probably need a new engine. Their current one is good enough for their non-open-world games, but it just isn't suitable for true open-world games.
 
Actually, they had the top-down game they were going to release for PS360, and the original FPS version was still going to release for Wii. They never followed through, though I would imagine that they still have all of those assets.

The Wii version changed to the top down version as well and then was internally killed off.
 
Hey folks. I have been in communication with our friends blu and bgassassin and was made aware of a link which details Brazos to some extent. I figured I'd share with you some discoveries concerning the Brazos die and what I find to be its relationship to Latte. The following is an official annotation of the Brazos made by AMD:
[Image: Bobcatlayout.png]
You can find the link here although the rest of it is in Russian. :p

After racking our brains and checking against some of the Radeon documentation available online, we can confidently identify the blocks most pertinent to our discussion. Some of the acronyms may be slightly off or incomplete, but the important thing is we have a very good idea of the blocks' functions. Going by the letters after the underscores we have:
TC=Texture Cache
TD=Texture Data
PAV=Primitive Assembly (Vertex?)
TAVC=Texture Address (Value Control? Vertex Coordinates?)
SPI_SC=Shader Processor Interpolation/Scan Converter (rasterizer)
LDS=Local Data Share
SQ=Sequencer (Ultra-threaded dispatch processor)
SX=Shader Export
CP=Command Processor
DB=Depth Buffer (ROP)
CB=Color Buffer (ROP)

Some very interesting notes come out of this.

Brazos' 4 ROPs are split into two asymmetric blocks - one for color and the other for Z/Stencil or depth. They are also located conveniently next to the DDR3 interface - just as they are on the RV770. Hey, they also look a lot like the W blocks on Latte! I know I wasn't the only one to think this. Yes, the block labelled TAVC also appears similar to the W blocks; however, I believe we can chalk this one up as merely a deceptive resemblance, as neither the location nor the actual number of SRAM banks is a true match. I will defer back to the RV770 and its attested relationship to Latte in concluding that Latte's ROPs are more similar in design to the blocks in that chip. Forget about exact SRAM layout and whatnot, because as we see on Latte, even two of the same blocks on one chip can have the SRAM arranged in different ways. Thus, I would still say that Latte has 2 identical ROP blocks (so 8 ROPs total).

Most of the setup engine (minus the rasterizer) seems to lie within the block labeled PAV. I hate to say, "I told you so," but there you go. The tessellator is definitely in this block, as it is shared by both the vertex and geometry setup units. After more reading, it is also clear that "Hierarchical Z" is a function of the rasterizer which checks against the depth buffer (ROP), and is not its own block. I believe the PAV block on Brazos to be a match for Block G on Latte, which I have labeled as the Vertex Setup in the past. Geometry setup may be a separate block on Latte or it may not. At this point, it doesn't really matter. We know it's there.

I don't really see this, but bgassassin found some similarities between the SQ block on Brazos and Block D on Latte. I believe Block D contains the Instruction Cache and Constant Cache, both of which feed the thread dispatcher. I still think the actual UTDP is Block I, as it appears similar to something a beyond3D member labeled as thread dispatch on the Tahiti die a while back. Again, we don't really need to know for sure; the point is that we can now identify the "front end" of Latte. The Scan Converter/Rasterizer will be nearby as well. Additionally, it is quite interesting to note that even though DirectX11 hardware supposedly does away with fixed function interpolation hardware, there appears to still be something left over from it in the spi-sc block on Brazos. Perhaps now it does not seem so ridiculous for the DirectX10.1-born Latte to feature full-blown interpolation units (J Blocks).

Block C appears very similar to CP on Brazos. This is exactly where I have always expected it to be, as the command processor receives instruction from the CPU and having it adjacent to the CPU interface just makes sense. As you can see on Brazos, the command processor has a direct channel to the CPU cores.

I would still have Shader Export as Latte's Block P, as it is the only block that really qualifies. The large and somewhat messy assortment of SRAM attests to the various buffers a SX necessitates.

Finally, I come back to the subject of the TMUs and L1 texture cache. What I previously identified in the Llano die has once again been confirmed with the Brazos annotation. I have even used my skills at MS Paint to make it crystal clear for you all. :)

On Brazos, we see the same basic SRAM configuration which I previously used to link the L1s in Llano to the S Blocks in Latte. We have the 2 longer rectangular banks + a much smaller squarish bank (In Brazos, the long banks are strangely staggered, with one slightly longer as well. This is a slight variation and hardly enough to discount the similarities). Right next to these, we have the 16 other small SRAM banks. I have speculated that these might be the cache tags. Whether they are exactly this or not, it is pretty obvious that they are cache related. I bet that if we were to get a more clear picture of the RV770 die, we would see these 16 banks in the TMUs, as they are on Latte. These variations exist across the different architectures. In the Llano photo linked to in the OP, for example, those 16 banks are on the right side of the L1 blocks while the other aforementioned banks are on the left. The bottom line is that this SRAM configuration is indicative of being texture cache and TMU related. The common number of banks points directly to there being 8 TMUs on Latte, just as on Brazos.

I hope this analysis has been helpful to some. It is my opinion that from the TMU count alone we can extrapolate that Latte contains 160 shaders. Anything more would be unbalanced and nonsensical. Once we get past the hangup that the SPUs and the TMUs are not in direct contact with one another, it becomes quite easy to see that Wii U features two fairly standard SIMD engines/cores. We must give up any pretense of a dual setup engine, and it should be remembered that the only GPUs to necessitate such a configuration were absolute behemoths with many times the shading capability of even high estimates for Latte.
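The extrapolation from TMU count to shader count follows standard R700-family ratios: one VLIW5 SIMD engine pairs 80 shader ALUs with 4 TMUs. Those per-SIMD figures are public facts about the architecture; applying them to Latte, along with the commonly reported 550 MHz clock, is this post's speculation, not a confirmed spec:

```python
# The TMU-to-shader extrapolation above, made explicit. Per-SIMD figures
# are standard for R700-family VLIW5 GPUs; applying them to Latte is the
# post's speculation, not confirmed hardware data.
TMUS_PER_SIMD = 4
ALUS_PER_SIMD = 80                        # 16 VLIW5 units x 5 lanes

latte_tmus = 8                            # inferred in the die analysis above
simds = latte_tmus // TMUS_PER_SIMD       # -> 2 SIMD engines
shaders = simds * ALUS_PER_SIMD           # -> 160 shader ALUs
clock_mhz = 550                           # commonly reported Wii U GPU clock
gflops = shaders * 2 * clock_mhz / 1000   # 2 FLOPs (multiply-add) per ALU per clock

print(simds, shaders, gflops)  # 2 160 176.0
```

Which is where the oft-quoted 176 GFLOPS figure for a 160-shader Latte comes from.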

As for the shaders themselves, I never claimed to have all the answers and I can't name exactly why they are so much larger than would seem to be necessary for 160 40nm SPUs. I've thrown out a few guesses in previous posts and I am sure that there have been a few modifications/shortcuts made especially for the Wii U hardware. Wsippel linked to an interesting article on thread interleaving - could be, but who knows? We could speculate forever (and that might be fun), but there's nothing else we can really say about them without actual Wii U developer insight. At this point, I don't know if I have much else to say about the Wii U GPU. I have attempted to analyze to the best of my abilities, without bias. People can believe or disbelieve as they please, just as I am sure that new game footage at E3 will have various folks claiming that the Wii U is both more and less capable than PS360. Ok, I think I'm going to go play some games now. :)
 

krizzx

Junior Member
Hey folks. I have been in communication with our friends blu and bgassassin and was made aware of a link which details Brazos to some extent. I figured I'd share with you some discoveries concerning the Brazos die and what I find to be its relationship to Latte. The following is an official annotation of the Brazos made by AMD:
You can find the link here although the rest of it is in Russian. :p

After racking our brains and checking against some of the Radeon documentation available online, we can confidently identify the blocks most pertinent to our discussion. Some of the acronyms may be slightly off or incomplete, but the important thing is we have a very good idea of the blocks' functions. Going by the letters after the underscores we have:
TC=Texture Cache
TD=Texture Data
PAV=Primitive Assembly (Vertex?)
TAVC=Texture Address (Value Control? Vertex Coordinates?)
SPI_SC=Shader Processor Interpolation/Scan Converter (rasterizer)
LDS=Local Data Share
SQ=Sequencer (Ultra-threaded dispatch processor)
SX=Shader Export
CP=Command Processor
DB=Depth Buffer (ROP)
CB=Color Buffer (ROP)

Some very interesting notes come out of this.

Brazos' 4 ROPs are split into two asymmetric blocks - one for color and the other for Z/Stencil or depth. They are also located conveniently next to the DDR3 interface - just as they are on the RV770. Hey, they also look a lot like the W blocks on Latte! I know I wasn't the only one to think this. Yes, the block labelled TAVC also appears similar to the W blocks; however, I believe we can chalk this one up as merely a deceptive resemblance, as neither the location nor the actual number of SRAM banks is a true match. I will defer back to the RV770 and its attested relationship to Latte in concluding that Latte's ROPs are more similar in design to the blocks in that chip. Forget about exact SRAM layout and whatnot, because as we see on Latte, even two of the same blocks on one chip can have the SRAM arranged in different ways. Thus, I would still say that Latte has 2 identical ROP blocks (so 8 ROPs total).

Most of the setup engine (minus the rasterizer) seems to lie within the block labeled PAV. I hate to say, "I told you so," but there you go. The tessellator is definitely in this block, as it is shared by both the vertex and geometry setup units. After more reading, it is also clear that "Hierarchical Z" is a function of the rasterizer which checks against the depth buffer (ROP), and is not its own block. I believe the PAV block on Brazos to be a match for Block G on Latte, which I have labeled as the Vertex Setup in the past. Geometry setup may be a separate block on Latte or it may not. At this point, it doesn't really matter. We know it's there.

I don't really see this, but bgassassin found some similarities between the SQ block on Brazos and Block D on Latte. I believe Block D contains the Instruction Cache and Constant Cache, both of which feed the thread dispatcher. I still think the actual UTDP is Block I, as it appears similar to something a beyond3D member labeled as thread dispatch on the Tahiti die a while back. Again, we don't really need to know for sure; the point is that we can now identify the "front end" of Latte. The Scan Converter/Rasterizer will be nearby as well. Additionally, it is quite interesting to note that even though DirectX11 hardware supposedly does away with fixed function interpolation hardware, there appears to still be something left over from it in the spi-sc block on Brazos. Perhaps now it does not seem so ridiculous for the DirectX10.1-born Latte to feature full-blown interpolation units (J Blocks).

Block C appears very similar to CP on Brazos. This is exactly where I have always expected it to be, as the command processor receives instruction from the CPU and having it adjacent to the CPU interface just makes sense. As you can see on Brazos, the command processor has a direct channel to the CPU cores.

I would still have Shader Export as Latte's Block P, as it is the only block that really qualifies. The large and somewhat messy assortment of SRAM attests to the various buffers a SX necessitates.

Finally, I come back to the subject of the TMUs and L1 texture cache. What I previously identified in the Llano die has once again been confirmed with the Brazos annotation. I have even used my skills at MS Paint to make it crystal clear for you all. :)


On Brazos, we see the same basic SRAM configuration which I previously used to link the L1s in Llano to the S Blocks in Latte. We have the 2 longer rectangular banks + a much smaller squarish bank (In Brazos, the long banks are strangely staggered, with one slightly longer as well. This is a slight variation and hardly enough to discount the similarities). Right next to these, we have the 16 other small SRAM banks. I have speculated that these might be the cache tags. Whether they are exactly this or not, it is pretty obvious that they are cache related. I bet that if we were to get a more clear picture of the RV770 die, we would see these 16 banks in the TMUs, as they are on Latte. These variations exist across the different architectures. In the Llano photo linked to in the OP, for example, those 16 banks are on the right side of the L1 blocks while the other aforementioned banks are on the left. The bottom line is that this SRAM configuration is indicative of being texture cache and TMU related. The common number of banks points directly to there being 8 TMUs on Latte, just as on Brazos.
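For readers wondering what "cache tags" would actually store, the usual arithmetic is straightforward. Every size below is a placeholder assumption chosen purely for illustration, not a measurement from the Latte die:

```python
import math

# Illustrative tag-bit math for a hypothetical L1 texture cache.
# 8 KB capacity, 64 B lines, and 2-way associativity are assumptions.
cache_bytes = 8 * 1024
line_bytes  = 64
ways        = 2
addr_bits   = 32

sets        = cache_bytes // (line_bytes * ways)    # lines are grouped into sets
offset_bits = int(math.log2(line_bytes))            # byte offset within a line
index_bits  = int(math.log2(sets))                  # which set an address maps to
tag_bits    = addr_bits - offset_bits - index_bits  # what the tag SRAM must hold

tag_sram_bits = sets * ways * tag_bits              # total tag storage
print(sets, tag_bits, tag_sram_bits)
```

The point is only that tag storage is small relative to the data arrays, which is consistent with those 16 small banks sitting beside the larger L1 data SRAM.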

I hope this analysis has been helpful to some. It is my opinion that from the TMU count alone we can extrapolate that Latte contains 160 shaders. Anything more would be unbalanced and nonsensical. Once we get past the hangup that the SPUs and the TMUs are not in direct contact with one another, it becomes quite easy to see that Wii U features two fairly standard SIMD engines/cores. We must give up any pretense of a dual setup engine, and it should be remembered that the only GPUs to necessitate such a configuration were absolute behemoths with many times the shading capability of even high estimates for Latte.
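As a back-of-envelope sketch of that extrapolation, assuming Latte keeps the R700-family balance of 80 VLIW5 stream processors and 4 TMUs per SIMD engine (an assumption about Latte, not a confirmed spec):

```python
# Hedged estimate: shader count from TMU count, assuming R700-style
# SIMD engines (16 VLIW5 units x 5 ALUs = 80 SPs, paired with 4 TMUs).
SPS_PER_SIMD  = 80
TMUS_PER_SIMD = 4

def shaders_from_tmus(tmu_count):
    simds = tmu_count // TMUS_PER_SIMD
    return simds * SPS_PER_SIMD

print(shaders_from_tmus(8))  # 8 TMUs -> 2 SIMDs -> 160 shaders
```

If Nintendo deviated from that per-SIMD balance, the number moves with it; the figure only holds under the standard R700 ratio.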

As for the shaders themselves, I never claimed to have all the answers and I can't name exactly why they are so much larger than would seem to be necessary for 160 40nm SPUs. I've thrown out a few guesses in previous posts and I am sure that there have been a few modifications/shortcuts made especially for the Wii U hardware. Wsippel linked to an interesting article on thread interleaving - could be, but who knows? We could speculate forever (and that might be fun), but there's nothing else we can really say about them without actual Wii U developer insight. At this point, I don't know if I have much else to say about the Wii U GPU. I have attempted to analyze to the best of my abilities, without bias. People can believe or disbelieve as they please, just as I am sure that new game footage at E3 will have various folks claiming that the Wii U is both more and less capable than PS360. Ok, I think I'm going to go play some games now. :)

Nice finding. Now we can label it much more accurately. This will make narrowing down the last few things a lot easier.

I'll have an edited version based on this up eventually if no one else makes one first.

As for the bolded, perhaps the simplest and most reasonable explanation is that it's not 160, as over half the people analyzing the chip in this thread have stated. I forget how that claim even came about in the first place.
 
And Trine 2: Director's Cut, and Deus Ex Director's Cut, and the Bayonetta 2 dev trailer (that 130k-triangle model).

The Deus Ex DC on Wii U is just using the DLC's upgraded visuals for the rest of the game. All the other platforms had those upgraded visuals with the DLC. NFS has fewer cars in multiplayer because the CPU is slow.
 

krizzx

Junior Member
The Deus Ex DC on Wii U is just using the DLC's upgraded visuals for the rest of the game. All the other platforms had those upgraded visuals with the DLC. NFS has fewer cars in multiplayer because the CPU is slow.

That is a lie. The dev specified Wii U enhancements, not DLC enhancements. The Wii U version already has the DLC content built into the game. These are enhancements taken a step further for the Wii U version of the game that are beyond the PS3/360 versions.
http://www.penny-arcade.com/report/...-wii-u-has-graphic-improvements-new-game-play

Even if the CPU were "slow" (a low clock is not the same as slow), that wouldn't affect the number of cars in multiplayer. That is completely unheard of. The correlation makes absolutely no sense.
 

A More Normal Bird

Unconfirmed Member
The Deus Ex DC on Wii U is just using the DLC's upgraded visuals for the rest of the game. All the other platforms had those upgraded visuals with the DLC. NFS has fewer cars in multiplayer because the CPU is slow.

I've pointed out the Deus Ex DLC thing before but apparently it keeps falling on deaf ears. That said, the shadow resolution and (post) AA is improved, and the developers did state that they upped the quality on some things over the PS3/360 versions.

That is a lie. The dev specified Wii U enhancements, not DLC enhancements. The Wii U version already has the DLC content built into the game. These are enhancements taken a step further for the Wii U version of the game that are beyond the PS3/360 versions.

It's not a lie. Why would anyone lie about this? The bulk of the improvements are from backporting the improved shading system from the DLC. If you had to choose a console version of the game to buy and were judging solely by visuals, you'd choose the Wii-U version, because the others only use the improved shaders in the DLC. But they still use them. EDIT: Rereading your post, I can kind of see where you're coming from in regards to Heavy's use of the word "just". Lie is too emotive a word to use though when it's more ignorance of a minor detail.

Here are some quotes, directly from the developers:
"After the release we continued to work with the game engine to improve some stuff because we knew we were doing DLC," he said. "In missing link, the visuals are better. We took all of that experience from Missing Link and added it to the entire game when working on the Wii U version."

"Even some of the graphics that were improved already on the 360 for The Missing Link, we were able to go a little bit farther with the Wii U, just because of the hardware. In terms of the shadows, which are smoothed out, it's much different than the Xbox version."
 

SmokeMaxX

Member
The Deus Ex DC on Wii U is just using the DLC's upgraded visuals for the rest of the game. All the other platforms had those upgraded visuals with the DLC. NFS has fewer cars in multiplayer because the CPU is slow.

Interesting claim. Care to support it with evidence?
 

JordanN

Banned
And Trine 2: Director's Cut, and Deus Ex Director's Cut, and the Bayonetta 2 dev trailer (that 130k-triangle model).
Not sure why this is being quoted, especially when it could easily be a high-resolution mesh never used in-game.

The model is even being shown as quads, whereas video games typically render triangles (unless you're a Sega Saturn).
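For anyone unfamiliar with the quad/triangle point: GPUs rasterize triangles, so modeling quads get split on export. A trivial sketch of the usual diagonal split (this index order is just one common convention, nothing more):

```python
def quad_to_tris(quad):
    """Split a quad (v0, v1, v2, v3) into two triangles sharing the
    v0-v2 diagonal, the way a mesh exporter typically would."""
    v0, v1, v2, v3 = quad
    return [(v0, v1, v2), (v0, v2, v3)]

print(quad_to_tris(("a", "b", "c", "d")))
```

So a quad-shaded wireframe is a hint the shot comes from a modeling package rather than an in-game renderer, where you would see the diagonals.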
 
At this point, I'm not sure using games to support a claim is effective. Some games look better, some don't. Given this information, it is impossible to conclude which system is more capable overall. Every example that shows Wii U having superior visuals can be countered with another example that looks inferior and vice-versa.

Even if Wii U is definitely more powerful, the degree to which it surpasses the other consoles is probably not significant enough to translate to an obviously superior image. And by "obviously," I mean that it is generally accepted that whatever Wii U is doing can't be done on other consoles.
 

krizzx

Junior Member
I've pointed out the Deus Ex DLC thing before but apparently it keeps falling on deaf ears. That said, the shadow resolution and (post) AA is improved, and the developers did state that they upped the quality on some things over the PS3/360 versions.



It's not a lie. Why would anyone lie about this? The bulk of the improvements are from backporting the improved shading system from the DLC. If you had to choose a console version of the game to buy and were judging solely by visuals, you'd choose the Wii-U version, because the others only use the improved shaders in the DLC. But they still use them. EDIT: Rereading your post, I can kind of see where you're coming from in regards to Heavy's use of the word "just". Lie is too emotive a word to use though when it's more ignorance of a minor detail.

Here are some quotes, directly from the developers:
"After the release we continued to work with the game engine to improve some stuff because we knew we were doing DLC," he said. "In missing link, the visuals are better. We took all of that experience from Missing Link and added it to the entire game when working on the Wii U version."

"Even some of the graphics that were improved already on the 360 for The Missing Link, we were able to go a little bit farther with the Wii U, just because of the hardware. In terms of the shadows, which are smoothed out, it's much different than the Xbox version."

You quote the very statements that make his claim a lie.

He clearly stated that it was just the DLC improvements, as if to say that it showed no advancement over the 360/PS3, when the dev clearly states otherwise.

Those statements were not made as guesses. They were asserted as fact.

At this point, I'm not sure using games to support a claim is effective. Some games look better, some don't. Given this information, it is impossible to conclude which system is more capable overall. Every example that shows Wii U having superior visuals can be countered with another example that looks inferior and vice-versa.

Even if Wii U is definitely more powerful, the degree to which it surpasses the other consoles is probably not significant enough to translate to an obviously superior image. And by "obviously," I mean that it is generally accepted that whatever Wii U is doing can't be done on other consoles.

It is not impossible to conclude at all. All it takes is logic and facts. To simply say "Some games look better, some don't" is to completely disregard all the details and nuances that actually make a difference.

You do not address why *insert game* looks better/worse. Details make quite a huge difference in any argument. There are no games that "look" worse. There are only games that perform worse or have missing features. All of said games were early ports, developed mostly on dev kits that used different hardware than the final retail Wii U, by people who were unfamiliar with the hardware. There was also an issue with one of the CPU cores not being used for most of the development. Those problems have not manifested themselves in a game since launch.

On the other hand, all of the games that perform better attribute that specifically to the significantly more powerful hardware.
 

A More Normal Bird

Unconfirmed Member
You quote the very statements that make his claim a lie.

He clearly stated that it was just the DLC improvements, as if to say that it showed no advancement over the 360/PS3, when the dev clearly states otherwise.

Those statements were not made as guesses. They were asserted as fact.
Unawareness of smaller details =/= lie, even if they are asserted as fact. I find it difficult to believe that anyone (without financial involvement) who isn't a child would lie about something like this, though obviously you disagree. Earlier in the thread a poster said to me that because the Wii-U was able to use the improvements in the PC DLC whereas the other consoles were not that was proof it was more powerful. I didn't call them a liar, I explained where they were wrong. Getting personally attached to and emotive over arguments such as these serves no purpose.

At this point, I'm not sure using games to support a claim is effective. Some games look better, some don't. Given this information, it is impossible to conclude which system is more capable overall. Every example that shows Wii U having superior visuals can be countered with another example that looks inferior and vice-versa.

Even if Wii U is definitely more powerful, the degree to which it surpasses the other consoles is probably not significant enough to translate to an obviously superior image. And by "obviously," I mean that it is generally accepted that whatever Wii U is doing can't be done on other consoles.

Though I mostly agree with your overall conclusion, I'd say it's generally more likely that poor performance is caused by poor programming/optimisation than better visuals are by better programming/optimisation.
 

krizzx

Junior Member
RE: Revelations Wii U performs worse than the other versions. The footage of Deus Ex Wii U also has significant stutter.

Even the PC version has those issues.
http://www.eurogamer.net/articles/digitalfoundry-resident-evil-revelations-face-off
They even state the 3DS version performs better than the Wii U version. I doubt that any component in the 3DS is more powerful than its counterpart in the Wii U. As much as I despise Eurogamer's intentionally insulting, presumptuous, fanboy-baiting writing, this clearly shows that most of the effort was put into the 360/PS3 versions.

Inconsistent stutter is always the result of poor optimization, especially when you have instances of the frame rate exceeding the cap like we have here.

I'm not seeing the Deus Ex stutter though. Animations are jerky sometimes, but other than that it looks pretty solid.
http://www.youtube.com/watch?v=I7rkeYwH1Fg
 
As for the shaders themselves, I never claimed to have all the answers and I can't name exactly why they are so much larger than would seem to be necessary for 160 40nm SPUs. I've thrown out a few guesses in previous posts and I am sure that there have been a few modifications/shortcuts made especially for the Wii U hardware. Wsippel linked to an interesting article on thread interleaving - could be, but who knows? We could speculate forever (and that might be fun), but there's nothing else we can really say about them without actual Wii U developer insight. At this point, I don't know if I have much else to say about the Wii U GPU. I have attempted to analyze to the best of my abilities, without bias. People can believe or disbelieve as they please, just as I am sure that new game footage at E3 will have various folks claiming that the Wii U is both more and less capable than PS360. Ok, I think I'm going to go play some games now. :)
Thanks for the effort fourth storm and everybody else. It's an interesting read.


There are no games that "look" worse. There are only games that perform worse or have missing features.
Inconsistent stutter is always the result of poor optimization
Lower framerate -> doesn't count
Missing trees -> doesn't count
Missing cars -> doesn't count
Launch game -> doesn't count
Stutter -> doesn't count

I think you couldn't possibly be more selective with your criteria: excluding the cases that don't fit your opinion while never scrutinizing the ones that do.
 
Shin'en mostly uses procedural textures, which means a few lines of code can stand in for what would otherwise be enormous amounts of stored texture data. You're confusing them with Two Tribes. And as you said, they probably didn't even try to use several gigabytes of textures in the first place. For all we know, that tweet could have been about squeezing 220MB down to 20MB. The whole game is only 536MB as it is.
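As a toy illustration of what "procedural" means here (a checkerboard chosen purely for brevity; Shin'en's actual techniques are not public), a texture can be a function evaluated at sample time instead of stored image data:

```python
# A texture as code: no pixel data is stored anywhere.
def checker(u, v, scale=8):
    """Return 0 or 1 for a checkerboard pattern at UV coords in [0, 1)."""
    return (int(u * scale) + int(v * scale)) % 2

# Sample one row of the pattern at 16 points across the texture.
row = [checker(x / 16, 0.0) for x in range(16)]
print(row)
```

The storage cost is the size of the function, regardless of the resolution it is sampled at, which is why procedural content compresses so dramatically.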

WayForward also managed to shrink Mighty Switch Force substantially after release, but I believe that was audio related.

Didn't the Toki Tori developers say that they found a secret function on the Wii U that gave them some extra RAM? I think it was on Twitter.
 