
Rumor: Wii U final specs

JordanN

Banned

Eh, doesn't seem up to par with what other developers had to say about the Wii U. Actually, this line seems kind of contradictory.

"With the Wii U being new hardware, we’re still getting used to developing for it, so there are still a lot of things we don’t know yet to bring out the most of the processing power."
 
Like I say, it's a one-off, pretty small in business terms, payment. I want to say it was $200 million for Xenos, but that sounds like a lot.
Sounds very, very off.

I mean, Nintendo paid $1 billion to license IBM CPUs in 1999; $200 million in 2005 probably wouldn't even cover R&D.

They'd be giving their chips away for free at that price, especially if they didn't get royalties for each unit sold later.
Maybe they get a small amount, very small, per piece or something, but that can't be much either.

Look at AMD stock lately; three console wins mean little.
Those consoles aren't out yet, so they aren't making money on them.

And Nintendo most likely owns everything about the Flipper architecture, so they're only making money with the X360; but I believe they are.

Also, bear in mind that the console market is not as big as the computer market. They sell 325 million computers per year, and AMD gets almost 20% of that (2010 numbers); that's 65 million processors. A lot of consoles don't reach that amount in their lifetimes, and what AMD is licensing to them at this point is low-end chips, so it can't amount to a lot of money; but it's good money to earn.
And AMD doesn't manufacture anything LOL.
They don't, but they might or might not still control the production pipeline; it's understandable if Nintendo, Sony, or Microsoft don't want to deal with that. I mean, they don't manufacture their consoles either; it's up to Foxconn and the like.
 
I think this pretty much confirms most (if not all) of what's already been speculated about the system. We knew the Wii U CPU wasn't going to be top of the line, but we also knew that developers wouldn't be pushing it at the start, just like NO first-generation games push the hardware.

Actually, the information about the graphics and RAM (which we already knew) is much more reassuring to me.
 
From B3D:

[image: boap9.jpg]


Mock if old.

This is relevant as well:

http://www.youtube.com/watch?v=AB3a9G7F8nc&feature=player_embedded
 
From B3D:

Dear ❚❚❚❚

Your service request : ❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚❚ has been reviewed and updated.

Response and Service Request History:

The Wii U utilizes an AMD E6760 GPU, which is a specially designed, embedded GPU inside the Wii U specifically. This is based around the original design and has obviously been modified for the Wii U and its specific needs and configuration.

If you have any other questions or concerns, please do not hesitate to reply to this e-mail directly and I will try to provide any additional information you may require. Thank you for contacting AMD!

In order to update this service request, please respond, leaving the service request reference intact.

Best regards,

ADM Global Customer Care

Mock if old.
I'd love that to be true.
 
It's Spanish....
It flows like Portuguese (and I'm Portuguese, so). It seemed out of place for Portuguese from Portugal, though, so it felt like Brazilian Portuguese.

But yeah, confirmed to be Spanish; I thought the "agregar a contactos" phrasing would be a little different in that language.
 

LOL, you think any of these guys are paying 1 billion to license anything?

I think AMD's entire revenue last year was 1.5B. So you're telling me every console win (three of them) is worth almost a year of revenue for them? Once again, LOL. And when does this massive revenue start? It wasn't in any of the last five years, lol.

Also, the way stocks work is that you price in things from the future. So it's already known AMD has all three next consoles, and the stock is doing terribly. Hint.

Winning consoles is very not-lucrative, fact. It's likely AMD does it for reasons mostly other than raw $.
 
I think AMD's entire revenue last year was 1.5B.

wrong.

2011 Annual Results

-- AMD revenue $6.57 billion, flat year-over-year

http://phx.corporate-ir.net/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1652123&highlight=
 

pramath

Banned
You guys are crazy lol (in a good kind of way)

It's like you're trying to solve an unsolvable murder mystery.

You're not going to find out who the killer is until November 18th.
 
It's like you're trying to solve an unsolvable murder mystery.

That's part of the fun of a shared murder mystery though. Tons of theories and eventual gloating rights.
 
LOL, you think any of these guys are paying 1 billion to license anything?
I don't know how much they're paying (and I never implied they'd make a single payment; it's you who talked about a one-off payment, and I quantified how much that could amount to... in 1999); I know, though, that ATi isn't giving GPUs out for free.

A single $200 million payment is giving it out for free. There are games costing $100 million now; it's peanuts when it comes to hardware and the years of R&D involved. And I'm sure ATi is doing it for the money, as Nvidia is doing with the PS3 (and did with the original Xbox).

There are lots of ways to pay it, though; so up to this point I'm betting not much other than R&D has been paid by any manufacturer, because, as you say, a single $1 billion payment is a lot even by today's standards. But if it were the total payment, that amount would probably be a bargain.
 

jerd

Member
Because someone faked an email?

lol because bg -> gaf -> "tech support" -> B3D -> gaf

Though I'm pretty sure (and bg can correct me if I'm wrong) he never said it was that card, he just said performance may end up similar to that based on info from others. I was just kidding :)
 

jello44

Chie is the worst waifu
You're not going to find out who the killer is until November 18th.

It was Reggie, in the Billiard Room with the Candlestick.
 

Absinthe

Member
If this is true, the Wii U has a DirectX 11-compliant, 576 GFLOPS, off-the-shelf(?) GPU consuming 35 W of power. Which is why I'd lean towards not true.

For what it is worth, this is a comment from this site, (sorry if old)
http://www.nintengen.com/2012/07/sp...howComment=1341772104930#c4217504903255964586

'I worked for AMD up until two months ago. The Wii U GPU is based off the AMD E6760 GPU without the onboard memory supplied. At the time I left, I was aware that it will draw from a shared RAM pool of 1.5GB from the Wii U (DDR3-1800) and run at a clock speed of 824 MHz. The main CPU is an IBM Power architecture quad core (with one disabled for yield, making it a tri-core) running at 3.0 GHz. The size of the eDRAM was not finalized before I left, nor was the clock speed of the main CPU. I got to work with two different engineering units; one had 3GB RAM with the CPU clocked at 3.4 GHz and the other had 1.5GB RAM with the CPU clocked at 3.0 GHz. I was told the 3.0 GHz unit was closest to production specs.'
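For reference, the 576 GFLOPS figure for a stock E6760 falls out of the usual stream processors × clock × 2 (multiply-add) arithmetic, and the same arithmetic shows what the 824 MHz clock claimed above would imply. A quick sketch; the 824 MHz number is just the unverified claim from the quote:

```python
# Theoretical single-precision peak for a VLIW5-era AMD GPU:
# stream processors x clock x 2 (a fused multiply-add counts as two FLOPs).
def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    return stream_processors * clock_mhz * 2 / 1000.0

# Stock AMD E6760: 480 stream processors at 600 MHz.
print(peak_gflops(480, 600))  # 576.0, matching the quoted spec
# Same shader count at the rumored 824 MHz clock:
print(peak_gflops(480, 824))  # 791.04
```

So if both the E6760 base and the 824 MHz clock were true, the part would be well past the stock 576 GFLOPS.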
 

Van Owen

Banned

This sounds like complete bullshit. No way the GPU and CPU clocks are that high. No developer would bitch about a 3 GHz CPU.
 
So it's an overclocked E6760 now.
 

User Tron

Member
This sounds like complete bullshit. No way the GPU and CPU clocks are that high. No developer would bitch about a 3 GHz CPU.

Well, I don't believe it's clocked at 3 GHz either, but I'd guess the complaints about the CPU are about its floating-point power. Xenon and Cell are number crunchers, so if the Wii U CPU has no special FP units it could be far less powerful in FP terms even at 3 GHz. In a GPGPU setup the FP power of the CPU is less important, so it wouldn't matter.
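That FP argument can be made concrete with the standard peak-throughput formula: cores × clock × SIMD lanes × ops per lane per cycle. A back-of-the-envelope sketch; the Xenon numbers are the commonly cited ones, and the second configuration is purely hypothetical, not a known Wii U spec:

```python
# Peak single-precision GFLOPS = cores x clock (GHz) x SIMD lanes x ops/lane/cycle.
def cpu_peak_gflops(cores: int, clock_ghz: float,
                    simd_lanes: int, ops_per_lane: int) -> float:
    return cores * clock_ghz * simd_lanes * ops_per_lane

# Xenon: 3 cores at 3.2 GHz with 4-wide VMX and fused multiply-add (2 ops/lane).
print(round(cpu_peak_gflops(3, 3.2, 4, 2), 1))  # 76.8

# Hypothetical 3 GHz tri-core with only a scalar FPU (no wide SIMD):
print(cpu_peak_gflops(3, 3.0, 1, 2))  # 18.0
```

So a CPU can clock as high as Xenon and still be far behind it in raw FP, which is exactly the gap a GPGPU setup would be meant to cover.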
 
hmm....
 

Goodlife

Member
Question.

It's obviously not an off-the-shelf GPU, whatever's in it.
So how "modified" is "modified"?

People are saying it's a modified E6760, so it's going to use 35 W and be a 576 GFLOPS card...

But would these modifications affect the power it uses/produces, or are some things set in stone, with the modifications just tinkering with other features?
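To first order, neither number is set in stone: shader throughput scales linearly with clock, while dynamic power scales with clock times voltage squared, so even simple clock/voltage tweaks move both figures (and changing shader counts or the memory interface moves them further). A rough sketch using illustrative numbers only, not real Wii U figures:

```python
# First-order scaling rules: throughput ~ f, dynamic power ~ f * V^2.
def scaled(gflops: float, power_w: float,
           f_old_mhz: float, f_new_mhz: float,
           v_old: float, v_new: float) -> tuple:
    ratio = f_new_mhz / f_old_mhz
    return gflops * ratio, power_w * ratio * (v_new / v_old) ** 2

# E6760-class baseline (576 GFLOPS, 35 W at 600 MHz) downclocked to 500 MHz
# with a small voltage drop -- all inputs illustrative, not a leaked spec.
g, w = scaled(576, 35, 600, 500, 1.0, 0.95)
print(round(g, 1), round(w, 1))  # 480.0 26.3
```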
 

TheD

The Detective
No way that email is real.

Tech support would have no reason to know what it is, let alone tell people what they know if they did!
 

StevieP

Banned
Which has been confirmed by the manufacturer

And retracted. For a reason.

specialguy said:
well, not exactly...

Before you froth at the mouth too much, you should know that clock-for-clock the G3 lineage outperforms whatever bastardized G4/G5 strip-a-thon you'd want to call the PPE in the 360/PS3 in many areas (certainly not floating point, though).

It's just that you're obviously not dealing with a high clock rate in the case of Wii U.
 