
Rumor: Wii U final specs

thirty

Banned
Checking out the Wii U firsthand this weekend, I can say it's definitely more powerful than the PS3/360, but honestly the bump reminds me of what it used to be like when you upgraded the video card in your PC a few years back: yeah, you'd get better framerates, textures and res bumps, and it all looked nice, but ultimately you were still playing the same games.
 

The_Lump

Banned
Until we have proof more concrete than this, I think it's only logical to accept this as the GPU base, for now, no?


Hmmm. I'm not so sure. They only mentioned e6760 because I questioned them about it.

Also: either they actually do somehow know something and have now been given clarification that it's immensely modified rather than just a tinkered-with e6760 (highly unlikely), or this is the most perfect example of GAF>Internet>GAF>Internet>GAF.
 
Hmmm. I'm not so sure. They only mentioned e6760 because I questioned them about it.

Also: either they actually do somehow know something and have now been given clarification that it's immensely modified rather than just a tinkered-with e6760 (highly unlikely), or this is the most perfect example of GAF>Internet>GAF>Internet>GAF.

It would be the most hilariously awesome situation if they only sent me the follow-up in response to what was read in these very threads discussing the original messages.
 

Linkup

Member
or slower, which is why the email warns against making presumptions about Wii U power from knowing what part it was based on.

Wouldn't slower be strange, considering the trouble it would cause for devs? Then you have the improvement in games as we near launch, though that happens for basically all games launching on a system. Last thing is Nintendo's history: didn't they upgrade the GC and 3DS not so far from launch?

Any modifications should make it more efficient/powerful.
 

AlStrong

Member
Like could Nintendo technically add more texture units if they wanted?

Short answer is yes, but the architecture was designed such that there was one TU per SIMD, so you'd have to change the number of SIMDs. The SIMDs themselves could have an arbitrary number of VLIW5 cores, so you could potentially increase the number of texture units whilst keeping the number of ALUs the same (to not bloat the die size as much).

The RBEs (groups of 4 ROPs) were decoupled entirely, but there's only so many you can fit on the chip & balance for a target bandwidth.
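The coupling described above can be put in rough arithmetic. This is a sketch following the post's own rules (one TU per SIMD, 5 ALUs per VLIW5 core, 4 ROPs per RBE); the specific configurations are illustrative, not confirmed Wii U figures:

```python
# Rough unit-count arithmetic for an R700-family part, per the post:
# texture units are tied to SIMD count, each VLIW5 core holds 5 ALUs,
# and each RBE groups 4 ROPs. Configs below are illustrative only.

def gpu_config(num_simds, vliw5_per_simd, rbes=2):
    """Headline unit counts for a hypothetical R700-style configuration."""
    return {
        "ALUs": num_simds * vliw5_per_simd * 5,  # 5 ALUs per VLIW5 core
        "TUs": num_simds,                        # one texture unit per SIMD
        "ROPs": rbes * 4,                        # each RBE groups 4 ROPs
    }

# 8 SIMDs x 16 VLIW5 cores: 640 ALUs, 8 TUs
print(gpu_config(8, 16))
# Double the SIMDs, halve the cores per SIMD: same 640 ALUs, but 16 TUs
print(gpu_config(16, 8))
```

The second call shows the trade the post describes: more texture units without more ALUs, at the cost of restructuring the SIMDs.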


[image: texture-l2-aligned-512.gif, a diagram of the texture unit / L2 cache layout]
 

Absinthe

Member
Let us summarize everything around the rumors of the E6760.

Fact -
The Wii U will be using Green Hills Software's MULTI integrated development environment.

Fact -
The first embedded GPU from AMD to use Green Hills Software will be the E6760.

Fact -
AMD support reps are revealing via email that the Wii U GPU is to be based on the E6760, or at least an embedded GPU.

Conclusion -
I think it is 'safe' to say, for now (based on simple deductive reasoning), that we can accept some form of the E6760 as the Wii U GPU.
 

Absinthe

Member
Hmmm. I'm not so sure. They only mentioned e6760 because I questioned them about it.

Also: either they actually do somehow know something and have now been given clarification that it's immensely modified rather than just a tinkered-with e6760 (highly unlikely), or this is the most perfect example of GAF>Internet>GAF>Internet>GAF.

As did I, but they did not give me specifics like they did for you.
 

The_Lump

Banned
It would be the most hilariously awesome situation if they only sent me the follow-up in response to what was read in these very threads discussing the original messages.


We should plant some kind of easter egg in our discussions, and see if it makes it into their next email :)

He's probably watching right now!

Hello AMD tech guy! Thanks for confirming that the WiiU is actually a slot machine using a highly customized super-e6760. We'll stop emailing now as this is obviously 100% fact. I'll be investing everything I own in AMD stock now that I'm 1000% certain Wii U is in fact an HD slot machine destined to take Vegas by storm. Good health!
 

Absinthe

Member
We should plant some kind of easter egg in our discussions, and see if it makes it into their next email :)

He's probably watching right now!

Hello AMD tech guy! Thanks for confirming that the WiiU is actually a slot machine using a highly customized super-e6760. We'll stop emailing now as this is obviously 100% fact. I'll be investing everything I own in AMD stock now that I'm 1000% certain Wii U is in fact an HD slot machine destined to take Vegas by storm. Good health!

Haha! That would be hilarious!
 

The_Lump

Banned
Let us summarize everything around the rumors of the E6760.

Fact -
The Wii U will be using Green Hills Software's MULTI integrated development environment.

Fact -
The first embedded GPU from AMD to use Green Hills Software will be the E6760.

Fact -
AMD support reps are revealing via email that the Wii U GPU is to be based on the E6760, or at least an embedded GPU.

Conclusion -
I think it is 'safe' to say, for now (based on simple deductive reasoning), that we can accept some form of the E6760 as the Wii U GPU.



Naaaaaaah. I still don't see why they would know. And if it were that easy for them to find out (by emailing a colleague, checking a database, etc.) then AMD has serious security issues!

Let's not forget all the valuable research/speculation & evidence that tells us it's not anything to do with this chip, other than having a similar performance/watt ratio.
 
Naaaaaaah. I still don't see why they would know. And if it were that easy for them to find out (by emailing a colleague, checking a database, etc.) then AMD has serious security issues!

Let's not forget all the valuable research/speculation & evidence that tells us it's not anything to do with this chip, other than having a similar performance/watt ratio.

Well, actually there hasn't been anything saying it isn't the chip... we simply know that the performance of whatever it's using is similar to it. I don't think bg (I might be mistaken) has said anything that dispels the possibility of the chip being used as a base.
 
Speculation, the part where you've said "some of the extra 1GB will be added to that later on".

And eDRAM has a lot of uses, not only as a framebuffer.
He said it was "very likely"; that in itself doesn't qualify as simple speculation, I'd say. It's simply throwing out a realistic possibility. Nintendo just recently did the same with the 3DS, in fact: reserving 64MB initially, then freeing up 32MB of that for post-launch software. Virtually every modern system does this.

And he also didn't say the eDRAM would only be used for the framebuffer.
 

The_Lump

Banned
Well, actually there hasn't been anything saying it isn't the chip... we simply know that the performance of whatever it's using is similar to it. I don't think bg (I might be mistaken) has said anything that dispels the possibility of the chip being used as a base.


Other than all the R&D they've ploughed into the r7xx based chip they already started building. That's the biggest kick in the nads for the e6760 thing.

I think it'll be something quite similar though. And I wouldn't be at all surprised if the two GPUs' design/creation was a little co-dependent.
 

tenchir

Member
Naaaaaaah. I still don't see why they would know. And if it were that easy for them to find out (by emailing a colleague, checking a database, etc.) then AMD has serious security issues!

Let's not forget all the valuable research/speculation & evidence that tells us it's not anything to do with this chip, other than having a similar performance/watt ratio.

AMD had a massive layoff a while ago. Could it be that some non-CS representatives (marketing, tech people, etc.) are filling in for some of those duties if they're low on people?
 

The_Lump

Banned
AMD had a massive layoff a while ago. Could it be that some non-CS representatives (marketing, tech people, etc.) are filling in for some of those duties if they're low on people?


Hmmm, bit of a stretch though. They still employ an enormous number of people, and unless they laid off their entire CS staff, it's unlikely!

Let's not get carried away again ;) I'm just taking this as a bit of fun at the moment.
 

wsippel

Banned
Let us summarize everything around the rumors of the E6760.

Fact -
The Wii U will be using Green Hills Software's MULTI integrated development environment.

Fact -
The first embedded GPU from AMD to use Green Hills Software will be the E6760.

Fact -
AMD support reps are revealing via email that the Wii U GPU is to be based on the E6760, or at least an embedded GPU.

Conclusion -
I think it is 'safe' to say, for now (based on simple deductive reasoning), that we can accept some form of the E6760 as the Wii U GPU.
I wrote about this yesterday - that line of thinking is bullshit. No, E6760 is not using anything by Green Hills. A company called Alt Software (that has nothing to do with Green Hills, AMD or Nintendo) sells OpenGL drivers for that particular chip on Green Hills' Integrity operating system. And that's it. It has nothing to do with Wii U. Nintendo wouldn't even need that driver to begin with, they don't use OpenGL. They might not even use Integrity.
 

Absinthe

Member
I wrote about this yesterday - that line of thinking is bullshit. No, E6760 is not using anything by Green Hills. A company called Alt Software (that has nothing to do with Green Hills, AMD or Nintendo) sells OpenGL drivers for that particular chip on Green Hills' Integrity operating system. And that's it. It has nothing to do with Wii U. Nintendo wouldn't even need that driver to begin with, they don't use OpenGL. They might not even use Integrity.

It's not a "line of thinking". It is what it is.

What chip IS using Green Hills then? I am curious since you and bg seem to know, without a doubt, that it is not a modified e6760.
http://www.ghs.com/partners/amd_partner.html

Besides that, I don't get the hardline stance (to the point of anger) against anything, anywhere, pointing to an e6760.

Edit: Question - If Nintendo wants the Green Hills IDE to be used, wouldn't it make sense that they are also using the INTEGRITY RTOS for continuity and ease? If so, that might be the smoking gun.

Edit: I see you are putting emphasis on 'using'. Maybe 'working in conjunction with' would be a more accurate term to use?
 

Absinthe

Member
For what it is worth.

'The modern architecture of INTEGRITY is well suited for multicore processors targeting embedded systems. INTEGRITY provides complete Asymmetrical Multiprocessing (AMP) and Symmetrical Multiprocessing (SMP) support that is optimized for embedded and real-time use. Embedded system designers can select the multiprocessing architecture that is right for the task. When coupled with the advanced multicore debugging features found in the Green Hills MULTI® tool suite, developers will reduce their time-to-market while increasing system performance and reliability. '

Source - http://www.ghs.com/products/rtos/integrity.html

The Wii U is using the MULTI® tool suite noted above. So, the question is, why would Nintendo NOT want to use INTEGRITY as well?
 

Absinthe

Member
Green Hills develops IDEs and operating systems. No GPU is "using" anything by Green Hills, just like no GPU is using Windows or Linux or Eclipse or Xcode.

I do know this, and I understand. Sorry for the poor choice of words. I will try to be more careful next time.

Which AMD GPUs are leveraging INTEGRITY, then? Do we know?

We do know Nintendo has decided to go with the MULTI IDE, and GHS highly recommends MULTI when leveraging INTEGRITY. So, could the inclusion of INTEGRITY in regards to the Wii U be why Nintendo went with the GHS IDE in the first place? It seems extremely probable.
 
See here:

http://www.guerrilla-games.com/publications/dr_kz2_rsx_dev07.pdf

There's more to framebuffer memory consumption than res and AA.

I'm a bit slow, but shouldn't that be 18MB for 4x5 720p? Do they also need a 'front' G-buffer because they're processing it with the SPUs once they've already started drawing another frame on the GPU side?

That space is not including the final render target if it works how I am imagining. I could be miles off here, so let me know!
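For what it's worth, reading the "4x5" as four bytes per pixel across five render targets (my assumption, not something the linked slides state), the 18MB figure does check out with simple arithmetic:

```python
# Back-of-envelope G-buffer sizing at 720p, assuming (hypothetically)
# five 4-byte-per-pixel render targets, e.g. 4 colour MRTs plus depth.

WIDTH, HEIGHT = 1280, 720
BYTES_PER_PIXEL = 4      # e.g. RGBA8 or D24S8
NUM_TARGETS = 5          # 4 colour targets + depth (assumed layout)

def gbuffer_mib(samples_per_pixel=1):
    """G-buffer footprint in MiB for a given MSAA sample count."""
    total = WIDTH * HEIGHT * samples_per_pixel * BYTES_PER_PIXEL * NUM_TARGETS
    return total / (1024 * 1024)

print(f"{gbuffer_mib(1):.1f} MiB")  # ~17.6, i.e. the '18MB' figure
print(f"{gbuffer_mib(2):.1f} MiB")  # doubles with 2x multisampling
```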
 

AlStrong

Member
Ahh, thanks for explaining.
I was confusing it with MLAA, but it might not have existed then.

Yeah, for KZ3, they traded the QAA for MLAA, saving quite a bit on memory and MSAA resolve speed. There'd be some memory overhead for MLAA (as you need a copy of the full-frame image in XDR), but it's certainly not nearly as much as the MSAA cost.
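The trade can be put in rough numbers. Assuming (hypothetically) a 720p frame with 4-byte pixels and a five-target G-buffer, doubling every target for 2x multisampling costs far more memory than the single full-frame copy MLAA needs:

```python
# Rough memory comparison at 720p with 4-byte pixels: the extra memory
# 2x MSAA adds across a five-target G-buffer vs. the one full-frame copy
# MLAA needs. The target count is an illustrative assumption, not a KZ3 spec.
W, H, BPP = 1280, 720, 4
NUM_TARGETS = 5                       # e.g. 4 colour MRTs + depth

frame = W * H * BPP                   # one 720p buffer: ~3.5 MiB
msaa_extra = frame * NUM_TARGETS      # 2x MSAA doubles every target
mlaa_copy = frame                     # MLAA: one copy of the resolved frame

print(f"2x MSAA overhead: {msaa_extra / 2**20:.1f} MiB")
print(f"MLAA frame copy:  {mlaa_copy / 2**20:.1f} MiB")
```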
 
How hard would it be for BG, Ideaman or anyone else with 'sources' to just ask what the GPU chip is based on and / or how many GFLOPS it can push.

It would end this circle of arguments.

Until we know the clock speeds / architecture of the CPU / GPU all the arguing and speculation is pretty pointless tbh...
 
How hard would it be for BG, Ideaman or anyone else with 'sources' to just ask what the GPU chip is based on and / or how many GFLOPS it can push.

It would end this circle of arguments.

Until we know the clock speeds / architecture of the CPU / GPU all the arguing and speculation is pretty pointless tbh...
Not knocking anybody :) But I always wondered: how would it even be possible at this stage to trace the source of a leak? I mean, if someone just spills the beans. And on the other hand, how can Nintendo do such an incredible job withholding the information?

I mean, there are hackers that have been able to infiltrate some of the U.S.'s most secure institutions.
 

ozfunghi

Member
Not knocking anybody :) But I always wondered: how would it even be possible at this stage to trace the source of a leak? I mean, if someone just spills the beans. And on the other hand, how can Nintendo do such an incredible job withholding the information?

I mean, there are hackers that have been able to infiltrate some of the U.S.'s most secure institutions.

Have you seen movies such as Enemy of the State and the like? Well, that exists in real life... but in real life IT'S NINTENDO! Shit... I've said too much!

Seriously though, I agree. How many registered developers are currently working on WiiU right now? They could all be the source of a leak. I understand that 6-12 months ago, only a few devs had exclusive information that -once leaked- could easily be traced back to them. But so close before launch? Case in point: has Arkam been fired yet?
 

AzaK

Member
Not knocking anybody :) But I always wondered: how would it even be possible at this stage to trace the source of a leak? I mean, if someone just spills the beans. And on the other hand, how can Nintendo do such an incredible job withholding the information?

I mean, there are hackers that have been able to infiltrate some of the U.S.'s most secure institutions.

The information can remain secret because Nintendo aren't even telling their developers. If there's a developer reading that wants to prove me wrong, then go for it, I dare ya :)
 

Eteric Rice

Member
They could do something nutty like change the specs (slightly) on the dev units for each company. So when someone releases the specs, they know which company the leak originated from.
 
^ For a while I felt that MS was doing that with Xbox 3 info.

I think you're partially right, in that it's intended both as a framebuffer and as low-latency GPGPU memory.

In fact, that brings me to another thought I'd had. I've been of the opinion for a while that this chip being manufactured in Fab 8 is the Wii U GPU, something which would imply that the eDRAM is on-die with the GPU, rather than on a separate die. If IBM is involved in manufacturing the GPU, there might be other implications, though. A few pages back, Matt talked about the Wii U's GPU having a "significant" increase in registers over the R700 series. More register memory would be a benefit to GPGPU functionality, but usually comes at the expense of added transistors, which means higher power usage, more heat and a larger, more expensive die.

But what if Nintendo has replaced the register memory (which I assume is usually SRAM) with eDRAM? IBM's eDRAM takes up roughly one third the transistors of SRAM, and has similar power and heat benefits. With it, they could double the available register memory, from 128 registers per thread to 256, while decreasing the transistor count over the reference R700. Alongside the extremely low memory latency that would come with an on-die 32MB pool of eDRAM, and the 32nm manufacturing process, this would make for a GPU which is very GPGPU code friendly, while still consuming very little power, both things which Nintendo seems to be focussing on. It would also be an answer to the question posed a few pages back of "Why would Nintendo spend years modifying a R700 rather than just using something like an E6760?"

I was thinking of something pretty much the same as this. I don't feel as crazy now. :p
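The transistor claim can be sanity-checked with back-of-envelope arithmetic. The only inputs taken from the post are the "roughly one third" eDRAM density and the doubling of register capacity; the baseline register file size below is a hypothetical figure purely for illustration:

```python
# Back-of-envelope check of the claim above: doubling register memory
# while switching from 6T SRAM cells to IBM eDRAM (~1/3 the transistors
# per bit) still lowers the total transistor count. The 256 KiB baseline
# register file size is a hypothetical figure, not a known Wii U spec.

SRAM_T_PER_BIT = 6            # classic 6-transistor SRAM cell
EDRAM_T_PER_BIT = 2           # ~1/3 of SRAM, per the post's claim

def regfile_transistors(size_kib, t_per_bit):
    """Transistors needed for a register file of the given size."""
    return size_kib * 1024 * 8 * t_per_bit

sram_base = regfile_transistors(256, SRAM_T_PER_BIT)       # 128 regs/thread
edram_doubled = regfile_transistors(512, EDRAM_T_PER_BIT)  # 256 regs/thread

print(edram_doubled < sram_base)                  # double the capacity...
print(f"{edram_doubled / sram_base:.2f}")         # ...at ~2/3 the transistors
```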

Was that "Wish they had in the 360 days"?

Yes.
 

ozfunghi

Member
They could do something nutty like change the specs (slightly) on the dev units for each company. So when someone releases the specs, they know which company the leak originated from.

We're less than 2 months from launch. Every developer actively working on WiiU titles must know the specifics of the hardware they're working with, I assume. They might not be able to tell what chip was the basis for the eventual retail WiiU GPU, but what speed it's running at, how many ALUs it has, etc... surely this can't be info Nintendo can withhold from its developers.
 
They could do something nutty like change the specs (slightly) on the dev units for each company. So when someone releases the specs, they know which company the leak originated from.
I don't know, but if that were the case, at least we would have heard something like "more or less x ROPs" or "between so and so MHz". Not even that.

Knowing the amount of passion Nintendo inspires in its fanbase, I find it amazing that some of these guys (real smart and resourceful ones) haven't dug up anything. You get people reverse engineering sh!t, and you're gonna tell me somebody hasn't been able to dig up some concrete specs of a piece of hardware by now?

This is what I find to be the most amazing thing about the WiiU. How does Nintendo pull this sh!t off? They could make a ton more money by being security consultants for some governments.
 
We're less than 2 months from launch. Every developer actively working on WiiU titles must know the specifics of the hardware they're working with, I assume. They might not be able to tell what chip was the basis for the eventual retail WiiU GPU, but what speed it's running at, how many ALUs it has, etc... surely this can't be info Nintendo can withhold from its developers.

XD
 
Good fucking god.

People need to stop contacting AMD tech support and posting their replies as if it means shitall.

If it turns out to be an E6760, it turns out to be an E6760.

But AMD tech support is not going to know what's in the Wii U. To believe they would is frankly all sorts of utterly stupid.
 

brainpann

Member
I don't know, but if that were the case, at least we would have heard something like "more or less x ROPs" or "between so and so MHz". Not even that.

Knowing the amount of passion Nintendo inspires in its fanbase, I find it amazing that some of these guys (real smart and resourceful ones) haven't dug up anything. You get people reverse engineering sh!t, and you're gonna tell me somebody hasn't been able to dig up some concrete specs of a piece of hardware by now?

This is what I find to be the most amazing thing about the WiiU. How does Nintendo pull this sh!t off? They could make a ton more money by being security consultants for some governments.

Nintenjas....


Good fucking god.

People need to stop contacting AMD tech support and posting their replies as if it means shitall.

If it turns out to be an E6760, it turns out to be an E6760.

But AMD tech support is not going to know what's in the Wii U. To believe they would is frankly all sorts of utterly stupid.

Someone should email AMD and ask what is in the PS4 and NeXtbox just to see what the reply is.
 
Good fucking god.

People need to stop contacting AMD tech support and posting their replies as if it means shitall.

AMD tech support is not going to know what's in the Wii U. To believe they would is frankly all sorts of utterly stupid.
That's the equivalent of the Timmy kid writing a letter to Santa; really, that's exactly what it looks like... for shame!
 

Nightbringer

Don´t hit me for my bad english plase
^ For awhile I felt that MS was doing that with Xbox 3 info.



I was thinking of something pretty much the same as this. I don't feel as crazy now. :p



Yes.

Well, it's a possibility, but I'm more with the idea that the eDRAM in the GPU is a replacement for the L2 cache.
 
Don't you XD me, MF!

Seriously. How can developers actually develop a game without such knowledge?

LOL. They have the features necessary for development, so they're good.

Well, it's a possibility, but I'm more with the idea that the eDRAM in the GPU is a replacement for the L2 cache.

?

It still has L2 cache along with the pool of eDRAM. That notion was about increasing the register memory for GPU compute purposes.
 

AzaK

Member
I was playing Mario Kart Wii with my little boy last night (3yo). He even managed to almost get around a track by himself, for the first time ever. We played 8 races, had a laugh, got our arses kicked on some levels and served up our own arse kickings on others. High fives flew whenever we did something cool and we both went to bed happy.

Unfortunately it was all ruined by the SD graphics and underpowered GameCube tech with stereo sound coming from the TV.







I don't think I really give a shit about Wii U specs anymore.
 