Not long. Someone from Chipworks is going to help us out even further.
Chipworks are good people.
I'm still looking for info on how we arrived at 20 ALUs. I just want this shit cleared up as much as possible, with concrete links to where the stuff we DO know about is coming from.
It's not "more detailed", it's showing something completely different. The one they have on sale only shows the metal layer above. They've sent us that photo as well; it would have been completely worthless. Chipworks went back to the lab and made "real" die shots showing the actual silicon for us. I'm really grateful for that, as what I suggested buying would have been a huge waste of money. That would have been quite embarrassing...
The fact we know it has DX10 or 11 features separates it. The RAM as well, which even in early Wii U exclusives looks to be allowing for things the HD twins' games never enjoyed, with or without sacrifice.
Are we certain of the process yet?
Not really. It's apparently made by Renesas (Chipworks thinks so as well), so almost definitely UX8 (40nm CMOS). And I'm pretty sure it's manufactured at the Renesas Yamagata plant.
Yeah, and that's fine, but coupled to a CPU that is admittedly worse and RAM that is undoubtedly much slower, it doesn't really help.
Problem is, Nintendo's best teams now have to make the switch to new tech and HD graphics. Third parties won't bother making ports of last-gen games look much better (even assuming they can), especially considering that the PS4/720 will be out soon and will obliterate the Wii U's specs in every possible way.
So MAYBE the GPU is overall potentially better than the PS3's/360's, but by the time games actually looking better (if that's possible) come out, no one will give a shit anymore.
That is why it is paramount that Nintendo has something to show at E3 that looks spectacular.
About the top left SRAM: its positioning is strange, but it almost has to be texture cache, right?
I think there are actually 37MB of memory embedded in the GPU. The small eDRAM pool above MEM1 consists of four blocks, so that's most likely 4MB. And the 32 identical blocks in the top left corner should add up to 1MB of SRAM.
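For what it's worth, the tally above can be sanity-checked with a quick script. The block counts come from the die shot, but the per-block sizes (1MB per small eDRAM block, 1MB combined for the 32 SRAM blocks) are the post's assumptions, not confirmed figures:

```python
# Rough tally of the memory embedded in Latte, based on the block
# counts visible in the Chipworks die shot. Per-pool sizes are
# assumptions from the discussion, not confirmed specs.
mem1_mb = 32.0              # the main MEM1 eDRAM pool
small_edram_mb = 4 * 1.0    # four identical blocks above MEM1, assumed 1MB each
sram_mb = 32 * (1.0 / 32)   # 32 identical SRAM blocks, assumed 1MB combined

total_mb = mem1_mb + small_edram_mb + sram_mb
print(f"total embedded memory: {total_mb:.0f}MB")  # 37MB under these assumptions
```

Swap in 3MB for the small pool and you get the 35MB figure floating around instead, which is where the 35-vs-37 confusion comes from.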
50% of the chip is eDRAM; my assumption is that this is a very weak processor.
31%.
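A percentage like that presumably comes from measuring the eDRAM macros against the whole die in the shot. A toy version of the calculation, with made-up numbers purely for illustration (nobody has posted real measurements in this thread):

```python
# Illustrative only: the areas below are invented placeholders,
# not actual Latte measurements.
die_area_mm2 = 150.0    # hypothetical total die area
edram_area_mm2 = 46.5   # hypothetical combined area of the eDRAM macros

fraction = edram_area_mm2 / die_area_mm2
print(f"eDRAM occupies {fraction:.0%} of the die")  # 31% with these inputs
```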
I have to ask again: can anyone show me where all this fixed function logic is? The non-shader/TMU/cache parts look the same size as any standard GPU's to me, and some of that has to be used for BC.
No, they don't. On the HD4870 die shot, the non-ALU part was at best as big as the ALU+TMU part (discounting the PCI Express and memory controller logic).
Why does Nintendo go through so much effort to obscure, deflect and hide info about their console hardware?
Didn't they just say that they're restructuring to face the challenge of HD development?
I doubt we'll see something really remarkable at this E3. The same way Capcom games are the standard on 3DS instead of Nintendo's.
About the top left SRAM: its positioning is strange, but it almost has to be texture cache, right?
I would think so. It seems weird to me that they use a lot of eDRAM, then go and "waste" so much space on SRAM, though...
If all your assumptions are correct and everyone on the planet thinks just like you.
Is it 37 or 35MB?
32 + 4 + 1, so 37MB. I guess.
As I've said from the beginning:
Nintendo is going to firmly position themselves halfway between console cycles: a weaker Wii U now, then a next console more powerful than Durango/Orbis. I fully expect it in 4-5 years, with Durango/Orbis going 7 in their cycle.
What does this mean for making a DolphinU?
Yes, but conflicting reports on this page, for instance: "32MB, with about 3 up top, so 35MB."
I think that's what you're talking about... right? eDRAM?
This thread is one of the reasons why Neogaf is the premier gaming forum on the internet.
No BS, no hyperbole, just the facts laid bare for us to see.
Thanks to the efforts of Thraktor, Durante and all the other great posters in the Wii U Tech Discussion thread, and their work making contact with Chipworks, this forum is one of the first places to reveal the in-depth GPU micrograph of the Wii U.
Do you really think that by the time games looking as good as/better than Sony's or Microsoft's best efforts on their current consoles come out, the whole "So is WiiU better or what" topic will get as much attention as it currently does?
You don't get rid of your junior any quicker by kissing ass.
You don't make friends by being rude and off-topic. Please take that somewhere else as it's unnecessary.
I'm shocked the X gif hasn't been posted yet...
Mod already posted a warning, so someone could get banned posting that.
Perhaps they could not get the latency they needed to match Flipper w/ eDRAM. Or perhaps they just wanted some superfast RAM as an intermediate stage between the texture units and the eDRAM (or the main pool of DDR3, for that matter).
Hollywood's eDRAM latency was pretty amazing, but eDRAM should do the trick, I guess. Maybe it's caching CPU-GPU communication? Would that even make sense? One thing's for sure, though: Nintendo loves complex memory subsystems. The audio DSP alone should have at least two equally large local buffers as well, so that it doesn't hog the bus.
Sounds like they sacrificed spec for BC. They should have given up on BC and gone balls out on the specs instead.
For what it's worth, BG had the Orbis specs right months ago, and a lot of his info (as opposed to his speculation) on Wii U was correct. That's entirely off-topic, mind.