Its absence from the supposed Wii U spec sheet might mean it's there only for BC, though.
Thanks a bunch. How do we go about finding out the number of shaders per cluster? And the fixed-function pipelines are separate from these on the die, yes? The rest of the people here seem to have settled on 40 shaders per block, which seems reasonable.
And sorry to bother you, it is late as hell over here, too :/ (2:31 a.m.).
Have you guys come to any kind of conclusion on the performance of this thing yet?
Yeah, of course they are... because we know for a fact how much, over the history of Nintendo, they love to bottleneck their systems.
8 shader units with 20 ALUs each = 160 ALUs @ 550MHz = 176 GFLOPS, plus 24+ GFLOPS of fixed-function shaders
Update: sorry I wasn't here to post earlier. The 176 GFLOPS is probably correct, since this part does seem to be VLIW-based. However, there is almost certainly at minimum Hollywood inside this die as well, considering how the Wii U handles backwards compatibility. At 550MHz that would give Hollywood 24 GFLOPS. Fixed functions are far better at doing their job than programmable shaders, but can do little else. It is more capable than the 360, but it is impossible to really compare beyond that.
FLOPS-wise it's looking like 352 GFLOPS. As Fourth Storm states above, a strong case can be made for 40 shaders per block, which would get us a total of 320 ALUs, which would equal 352 GFLOPS. There are still a lot of unknowns though; a lot of people believe there's a level of fixed-function hardware, but it's pretty unproven at this point.
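To make the arithmetic in these posts explicit, here's a minimal sketch of the GFLOPS math being thrown around, assuming the standard VLIW5 figure of 2 FLOPs per ALU per cycle (one multiply-add); the Xenos line is just the commonly cited 360 figure for comparison, not a confirmed spec:

```python
# Hedged sketch of the thread's FLOPS arithmetic (not official specs).
# GFLOPS = ALUs x FLOPs-per-ALU-per-cycle x clock (MHz) / 1000,
# assuming 2 FLOPs/ALU/cycle (one multiply-add) for VLIW5 parts.
def gflops(alus, clock_mhz, flops_per_cycle=2):
    return alus * flops_per_cycle * clock_mhz / 1000.0

print(gflops(160, 550))  # 8 blocks x 20 ALUs -> 176.0
print(gflops(320, 550))  # 8 blocks x 40 ALUs -> 352.0
print(gflops(240, 500))  # commonly cited Xenos (360) figure -> 240.0
```

At 40 ALUs per block, 352 GFLOPS comes out to roughly 1.5x the commonly cited 240 GFLOPS for Xenos, which lines up with the "1.5x the 360" estimates in this thread.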
Basically this thing is 1.5x the 360 at the very least. Beyond3D is claiming it's CPU- and bandwidth-starved and therefore has some serious bottleneck problems, but Iwata says it's not. *shrugs*
You can look at the GameCube and say the disc format they chose was a bottleneck. But what B3D is claiming is that the memory/bandwidth of the components together will be a bottleneck for the Wii U, when ALL we have heard is the opposite. Ancel, Shin'en, and Nintendo have all talked about the memory of the system, mainly the hierarchy and how it is set up. I truly just believe it's gonna take time for developers to learn how this system works. That shouldn't be a surprise, as ALL consoles need time before developers get the most out of the hardware and get used to how to best use its features... but for some reason we can go ahead and write the Wii U off when it hasn't been out for 6 months yet.
Is it possible that some of the unknown areas in the GPU are transistors responsible for GamePad streaming?
Low consumption = little case = more cash.

So it's what has always been pointed at. A little more powerful than XBOX.
I thought third parties were involved in some way in the making of the hardware. Shovelware third parties? "Oh yeah sure, we will perfectly run Obut Pétanque on this!"
It's too different in architecture from PS360, so no multiplats. It's too weak compared to Duranbis, so no multiplats. I can't understand the design philosophy behind this hardware.
And why this obsession with power consumption? Why keep it SO low? I mean, with the 3DS, they had no shame selling a device with 2 hours of battery life. For a damn handheld. Where is the logic?
I'm going to go out on a limb and say that they don't care that much about multiplats.
A high performance memory pool is one of the easiest things to utilize in a GPU design. By a large margin.
I guess anything that could help. I just don't think we are EVER gonna get the exact numbers everyone is looking for. The games are gonna look amazing though. Didn't you say a developer said Wii U games will still look amazing after we have seen some great titles for the 720 and PS4?
Is there any possibility of lightening the die shot to better scrutinize all the brown parts besides the eDRAM? Plox plox plox!
http://www.anandtech.com/show/4479/amd-a83850-an-htpc-perspective
Llano or "Redwood" GPU is possible; it has 40 ALUs per SPU, and we see 8 SPUs on the Wii U versus 10 here. Thus 352 GFLOPS. Interesting. I'm glad I was probably wrong, as that didn't make much sense given AC3 is a launch game that is handled faithfully on Wii U vs the 360 version.
http://en.wikipedia.org/wiki/Evergreen_(GPU_family)#Radeon_HD_5500
Redwood LE (HD 5550) happens to be 320 ALUs clocked at 550MHz, and based on the first link it would have 10 SPUs with 2 disabled. On the Wii U it would just have 8 SPUs...
While we're on that topic: any idea what the two small SRAM pools are? They appear to have the exact same interface as the blocks in the top left, so they should be plain SRAM, not registers. I guess the one at the top is 128kB and the one at the bottom is 64kB. Or maybe it's 256/128, but in that case we should be looking at 2MB of SRAM in the top left corner...
It seems to me the design, or what we believe to be the design, is the only way to make it work. They had to customize it this way to keep power consumption as low as they wanted. If it's true that there's a lot of fixed-function hardware, wouldn't that be easier to pull off than trying to program everything, which would push power consumption way up?
Edit: basically, for example, fixed-function shaders vs. programmable shaders?
The HD 5550 draws over 35W and is already on a 40nm process.
Obviously it isn't just an HD 5550, but you are comparing a desktop GPU to a console embedded design on an MCM, and the HD 5550 is a 39-watt TDP with 2GB of GDDR5.
Redwood also happens to be used in the e6760, with 50% more shaders (480 ALUs vs the 320 found in the HD 5550), clocked at 600MHz with 1GB of GDDR5, and it still manages to use only 35 watts.
Which makes it all the more unlikely imo.
There's a clear relation between ALU count and power draw.
The chart listed does not take into account the power draw of the RAM; AMD's reference PowerPoint indicates chip only.
And the e6760 is huge, and based on Turks.
Unless the architecture is significantly more advanced than the DX10.1 Nintendo/AMD directly referenced, and the R7xx the previous rumors suggest, I find 320 hard to believe given the constraints leveled against it. IMO.
Apropos, those 'spec sheets' were hardly spec sheets but more like feature sheets.
The chart you linked to had the same TDP (39W) for 512MB, 1024MB and 2048MB.

Actually the e6760 is a complete solution; it has the RAM onboard as part of the 35-watt package. Removing it would drop the wattage to ~30W. Redwood LE has the same 320-ALU setup, or look at it from the Turks perspective and remove 4 SPUs, giving you 8 SPUs or 320 ALUs. That would also drop the power usage by ~33%, giving you about 20 watts for what the Wii U is likely using.
320 ALUs also happens to be the e4690's setup, which is the R700-line embedded part, also using 25 watts with a core clock of 600MHz. Since both are VLIW5 and either could be used in this highly custom chip, it doesn't really matter which we pick from... 320 ALUs seems extremely likely given all this new information.
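The power-scaling argument above can be sketched numerically. Every input here is one of the thread's own assumptions (a 35W e6760 package, roughly 5W for the onboard RAM, and power scaling linearly with ALU count), not a measured figure:

```python
# Back-of-envelope estimate, assuming shader-array power scales
# roughly linearly with ALU count. All numbers are the thread's guesses.
e6760_package_w = 35.0                 # e6760 module TDP, RAM included
ram_w = 5.0                            # assumed onboard GDDR5 share -> ~30W chip
chip_w = e6760_package_w - ram_w
wii_u_estimate = chip_w * (320 / 480)  # cut from 480 ALUs down to 320
print(round(wii_u_estimate, 1))        # -> 20.0 watts, matching the ~20W claim
```

Note that linear scaling with ALU count is the weakest assumption here: memory interfaces, ROPs and fixed-function blocks don't shrink with the shader array, so this is a lower-bound sketch at best.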
Takeda said...
Unless they're giving concrete specifics, people should learn not to hang off the words of company executives regarding their system's relative performance, when they have a very vested interest in painting their products in a positive light.
I was referring to a comment above about whether "Takeda was a liar" by saying the CPU isn't "weak." Perhaps I should have quoted.

Do you know what I'm referring to? Or are you simply here to lump all comments into one box or another? If you're interested in discussing something that may or may not bear fruit, then please address it specifically rather than discarding the talking points of others into a pile of scrap.
Considering what we are seeing in relation to the supposed "weak cpu", I think we should also factor in whether we think Takeda is a liar or not when he said that "There are people saying that the CPU is weak but that is not true."
Iwata said...
I.e. No executive is going to come out and say "Our hardware is weak." It has nothing to do with being "a liar."
The chart you linked to had the same TDP (39W) for 512MB, 1024MB and 2048MB.
We don't have to discuss TDP anymore. TDP is pretty much all you have if you haven't seen the die and don't know the clock. We have a photo of the die and know the clock now, so TDP is not really all that interesting anymore.
If the Wii U has 34 MB of eDRAM, then that's at least one point where it is better than the other next-gen consoles.
That's certainly better than what the WUST threads had in mind, as they never predicted the Wii U to excel in any area.
We are not going to be able to get the answers we want and desire. All the proof of what it can and can't do will be in the games.
1st and 2nd party Nintendo titles are gonna look stunning EVEN after the PS4 and 720 have games on the market.
It was ballpark speculation at the time based on what we had eyeballed at the event, but the final GPU is indeed a close match to the 4650/4670, albeit with a deficit in the number of texture-mapping units and a lower clock speed - 550MHz. AMD's RV770 hardware is well documented so with these numbers we can now, categorically, finally rule out any next-gen pretensions for the Wii U - the GCN hardware in Durango and Orbis is in a completely different league. However, the 16 TMUs at 550MHz and texture cache improvements found in RV770 do elevate the capabilities of this hardware beyond the Xenos GPU in the Xbox 360 - 1.5 times the raw shader power sounds about right. 1080p resolution is 2.25x that of 720p, so bearing in mind the inclusion of just eight ROPs, it's highly unlikely that we'll be seeing any complex 3D titles running at 1080p.
However, while we now have our most important answers, the die-shot also throws up a few more mysteries too - specifically, what is the nature of the second and third banks of RAM up on the top-left, and bearing in mind how little of the chip is taken up by the ALUs and TMUs, what else is taking up the rest of the space? Here we can only speculate, but away from other essential GPU elements such as the ROPs and the command processor, we'd put good money on the Wii U equivalent to the Wii's ARM 'Starlet' security core being a part of this hardware, along with an audio DSP. We wouldn't be surprised at all if there's a hardware video encoder in there too for compressing the framebuffer for transmission to the GamePad LCD display. The additional banks of memory could well be there for Wii compatibility, and could account for the 1MB texture and 2MB framebuffer. Indeed, the entire Wii GPU could be on there, to ensure full backwards compatibility.
While there's still room for plenty of debate about the Wii U hardware, the core fundamentals are now in place and effectively we have something approaching a full spec. It took an extraordinary effort to get this far and you may be wondering quite why it took a reverse engineering specialist using ultra-magnification photography to get this information, when we already know the equivalent data for Durango and Orbis. The answer is fairly straightforward - leaks tend to derive from development kit and SDK documentation and, as we understand it, this crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware.
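One figure in the quoted piece is worth checking directly: 1080p carries 2.25x the pixel count of 720p, which is straightforward to verify:

```python
# Quick check of the 720p-vs-1080p pixel ratio discussed in the article.
px_720p = 1280 * 720       # 921,600 pixels
px_1080p = 1920 * 1080     # 2,073,600 pixels
print(px_1080p / px_720p)  # -> 2.25
```

That 2.25x figure is why the eight-ROP count matters: pushing 2.25x the pixels per frame with the same ROP throughput leaves little headroom for complex scenes.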
The system currently locks all games being played at 33-34W, according to Eurogamer.
Simple, low power, GamePad and close to profitable. Something's gotta give, and it's graphical/processing prowess.
Hi IdeaMan, been wondering where you got to. Any more info you can get from your sources is always appreciated.

Very busy IRL these days, but I've followed from afar the "crowdfunding GPU die picture project" on the Wii U technical discussion thread (and I would have loved to give $20 to FS, but by the time I was aware of this thing the funds were already raised and it was "closed"). Big thanks to the enthusiasts and Chipworks for all of this; this is pure investigation!
I'll try to submit this die picture to some developer sources to see if it could unlock some additional info.
The exotic side of this chip is promising, but only if the tools to take advantage of it are accessible enough for developers, of course.
I hear ya, but even if it was 160 shaders, it's the games that will speak for themselves.

After reading the initial posts yesterday about how the number of shader units would be lower than expected, I seriously thought that it would be significantly less than 320 (because 320 was still the low end of the latest guessing game of 320-480 SPs).
So yeah, if it's actually 320 + whatever fixed-function thingy is on there (Pikmin magic?) + additional eDRAM and SRAM (somewhere between 4-6MB combined?), then that's actually sort of in line with our original (already revised, mind you) expectations. So it's pretty decent imo.
If it isn't 320, but rather 160 (but that's not likely anymore afaik?) then... uhoh.
When games like Aliens: Colonial Marines, Monster Hunter in stress scenes, or Pikmin when it's overloaded with sprites and animations get released, make new PSU measurements. Or even better, wait for "next gen" ports. Because from my point of view, all next-gen consoles are GPU-centric consoles, just like the Wii U.
The 32W was measured with the Mario game and other GPU-light games.
From my personal PC/PSU experience: if someone tells you the PSU is 850W and it's 80% Silver or 90% Gold efficiency, they never count that 10% or 20% in the overall watts they sell you. Also, PSUs have a safety margin of +-10, 20, or 80 watts... I believe the Wii U GPU draws about 10 to 30+ watts, depending on the game. Time will tell.
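The PSU-efficiency point can be made concrete: power measured at the wall is not the power delivered to the components. A minimal sketch, using the ~33W wall reading reported in this thread and the 80%/90% efficiency figures from the post above as example inputs:

```python
# Sketch: wall draw vs. power actually delivered to the board.
# The 33W reading and the 80%/90% efficiencies are this thread's
# example numbers, not measured Wii U figures.
def delivered_watts(wall_watts, efficiency):
    return wall_watts * efficiency

print(round(delivered_watts(33, 0.80), 1))  # -> 26.4 at 80% efficiency
print(round(delivered_watts(33, 0.90), 1))  # -> 29.7 at 90% efficiency
```

So a 33W wall measurement implies the console's components are drawing somewhere in the mid-to-high 20s of watts, depending on how efficient the supply actually is.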
DF bias is really starting to get annoying. They speak in absolutes when we are now more confused than ever about the GPU.
They basically discredit 40% of the chip, why?
I question their credibility in examining performance of games when they seem to have already prejudged Wii U.
Nah, like 10 minutes in GIMP. Also, a lot of the information is outdated.

I agree that this should be added to the OP. Looks like ScepticMatt put some time into it.
A good chunk of the silicon is still of unknown nature.

Such as?