
Rumor: Wii U final specs

Absinthe

Member
I'm guessing slightly gimped, but they wouldn't take out DX11 functionality would they? That would be great news

Edit: Let me say though, I'm still riding the skeptic train on these emails. Just seems really odd.

So how about all the skeptics email them then?
 

Indyana

Member
Guys, that email was REAL. I just decided to email AMD on a whim and they sent me the EXACT same thing.

[attached screenshot of AMD's email reply]

I'm speechless.
 
hoping this does not mean gimped.

It means customized. They're not going to use an off-the-shelf part. As some other users like BG have hinted, it's supposed to have some fixed-function aspects as well. That's not going to be part of the default chip, and it probably also doesn't have the 1 GB of RAM embedded on the chip board the way the default E67xx does.

*edit*

Whatever chip they based it on, E67xx or not, it's going to have some adjustments made, because it's going to be a custom chip. It will not be an off-the-shelf part.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
http://forum.beyond3d.com/showpost.php?p=1668358&postcount=2583

The tech support team at AMD doesn't have any information about console products. AMD does not provide end-user support for those, so there is no reason to provide that team with any info. Especially not for currently unreleased products. When was the last time you emailed AMD about a problem with your Xbox360 or Wii? You are not our customer. Our customer for these chips is MS or Nintendo, who pay many millions of dollars. When they have support issues/questions, they do not go through the public-facing tech support team.

They (and their lawyers) also expect us to keep our mouths shut.

This is either fake, or a support guy trying to sound like he knows something when he does not. Either way, it tells you nothing.

The vast majority of people inside AMD have no idea about the details of the WiiU. It was done by a relatively small team and any information outside that team was "need-to-know". Even if you surveyed the GPU IP team which originally designed the base GPU family, >95% of them could not tell you what the configuration is. Only a few needed to be involved to get the specific configuration correct and working, and they know to keep their mouths shut. All additional modifications were done by the "need-to-know" team.

Before you ask.. Yes, I know all the details. No, I will not tell you any of them.
 

Ryoku

Member
Guys, that email was REAL. I just decided to email AMD on a whim and they sent me the EXACT same thing.

[attached screenshot of AMD's email reply]

The wording is weird.

"The E6760 is a specialized, embedded GPU that has been modified for the Wii U. It is a 67xx series GPU, but with adjustments made to optimize performance for the Wii U's functionality."

It makes it seem as though the E6760 itself is a customized/adjusted 67xx series GPU that has been made specifically for the Wii U's functionality, but I doubt that's true.
 

jerd

Member
Your question about DirectX 11? I've read around here that DirectX 11 = Microsoft, so yeah, you can't just go and use it on a non-Microsoft product, but its key features are what the guy seems to be saying you can do

Yeah I guess that makes sense. I know they can't use actual DX11, but it kinda sounded strange to me how he sounded so unable to say yes or no. I guess he was just talking about it not actually being DX11 though.
 
Guys, that email was REAL. I just decided to email AMD on a whim and they sent me the EXACT same thing.

[attached screenshot of AMD's email reply]

So, what we can safely guess:

CPU: 3 enhanced Broadway CPU Cores with increased cache
GPU: Modified E6760, possibly 576 GFLOPS, 480 shader processors, multiple display support (Wii U pad, obviously), 128-bit memory interface.
RAM: 2 GB of RAM of an unknown type (1 GB is currently reserved for the OS), as well as 32 MB of eDRAM.
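
For what it's worth, the 576 GFLOPS figure matches the stock E6760's published numbers: 480 shader processors × 2 FLOPs per clock × 600 MHz = 576 GFLOPS. That assumes the customized part keeps the stock 600 MHz clock, which nothing in the email actually confirms.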
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I really don't agree with this. But I've seen "GP" cover so many different aspects that I really don't have a clear idea what you may be alluding to.
Well, colloquially, the GPGPU domain is considered everything that up until recently would run on CPUs, but today is being migrated to GPUs. Just because GPUs do these tasks better today. The fact a given 'GP' task historically started life on CPUs does not mean CPUs were well suited for it, or that the task was 'GP' as per today's understanding - CPUs are the default choice for a lot of tasks just due to historical reasons. I.e. until somebody discovers/designs a common part that does said task better.

GPGPUs are pretty much a huge collection of SIMD units. A lot like Cell SPEs, except exponentially more HP.
Entire domains of tasks could benefit from a huge collection of ALUs, SIMD or otherwise.
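
To make that concrete, a rough sketch (plain C, hypothetical particle update): the kind of task I mean is one where every element is independent, so the same loop body maps onto however many ALUs you can throw at it, GPU or otherwise.

```c
/* Illustrative, embarrassingly parallel workload (hypothetical particle update):
   every element is updated independently, so the same loop body maps onto
   thousands of GPU ALUs, or onto a few CPU SIMD lanes, with no coordination
   between elements. */
#include <stdio.h>
#include <stddef.h>

typedef struct { float x, y, vx, vy; } Particle;

static void integrate(Particle *p, size_t count, float dt) {
    for (size_t i = 0; i < count; i++) {   /* each iteration is independent */
        p[i].vy -= 9.81f * dt;             /* gravity */
        p[i].x  += p[i].vx * dt;
        p[i].y  += p[i].vy * dt;
    }
}

int main(void) {
    Particle p[4] = {{0, 0, 1, 2}, {1, 1, -1, 0}, {2, 2, 0, 3}, {3, 3, 2, -1}};
    integrate(p, 4, 0.016f);
    for (int i = 0; i < 4; i++)
        printf("%.3f %.3f\n", p[i].x, p[i].y);
    return 0;
}
```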

It shares the same drawbacks though.

1. Memory Latency. The hardest one to shake because GPUs are sorta built to tolerate latency rather than combat it.
I'm not sure I follow. What's the difference between tolerate and combat in this case?

2. A wide SIMD lane, so it will generally struggle with anything logical
You mean it won't be efficient. But whether it will struggle or not depends on what is the logical code under consideration and what is the GPU ISA.
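
To illustrate what the inefficiency looks like (rough C sketch, not any particular GPU ISA): when lanes of the same vector take different sides of a branch, the machine typically evaluates both sides for every lane and masks the results, so a divergent branch costs roughly the sum of both paths.

```c
/* Illustrative only: how a wide-SIMD/SIMT machine handles a divergent branch.
   Each "lane" is one element of the vector; on real hardware the masking and
   both-sides execution happen in silicon, not in C. */
#include <stdio.h>

#define LANES 8

int main(void) {
    int x[LANES] = {1, -2, 3, -4, 5, -6, 7, -8};
    int out[LANES];

    /* Scalar source: if (x < 0) out = -x; else out = x * 2;
       A wide-SIMD machine evaluates both arms for every lane,
       then a per-lane mask selects which result survives. */
    for (int lane = 0; lane < LANES; lane++) {
        int mask     = (x[lane] < 0);   /* per-lane predicate (the "mask") */
        int if_true  = -x[lane];        /* computed for ALL lanes */
        int if_false = x[lane] * 2;     /* computed for ALL lanes */
        out[lane]    = mask ? if_true : if_false;
    }

    for (int lane = 0; lane < LANES; lane++)
        printf("%d ", out[lane]);
    printf("\n");
    return 0;
}
```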

3. A lack of branch HW. This was the case for this gen so it won't be that big of a change next-generation but the few developers I know aren't fond of this at all.
I assume you mean lack of branch prediction hw, since GPUs have had flow control hw for some time now. It's not equivalent to CPU branch hw, but it could not be - branching in a massively parallel processor is an entirely different problem to branching in a single control flow.

Personally, I would rather just integrate a few decent SIMD units in the CPU and let my GPU do its thing, but perspective is always interesting.

What would you rather see?
Of course a CPU needs its 'private' SIMD units, if nothing else just for sporadic low-latency tasks. The tricky question, though, is: How much is enough? Are you sure you're spending your transistor budget wisely by placing those SIMD units with the CPU?
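
For concreteness, here's a minimal sketch (SSE intrinsics, made-up numbers) of the kind of sporadic, low-latency job I'd keep on the CPU's private SIMD units, where the cost of a GPU round trip would dwarf the work itself:

```c
/* Minimal sketch (illustrative, SSE intrinsics): a tiny 4-wide multiply-add
   where shipping the data to a GPU and back would cost far more than the
   computation itself. */
#include <stdio.h>
#include <xmmintrin.h>   /* SSE: 128-bit float vectors */

int main(void) {
    float a[4]   = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4]   = {0.5f, 0.5f, 0.5f, 0.5f};
    float c[4]   = {10.0f, 20.0f, 30.0f, 40.0f};
    float out[4];

    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    __m128 vc = _mm_loadu_ps(c);

    /* out = a * b + c, all four lanes at once */
    __m128 vr = _mm_add_ps(_mm_mul_ps(va, vb), vc);
    _mm_storeu_ps(out, vr);

    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```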

Surprisingly (or not) here's what Intel more or less did with Larrabee - they took some mature-design cores, slapped some advanced-design SIMD silicon on them (and by that I mean lots and lots of it), packed them all together on a coherent fat SMP infrastructure and gave that contraption to some very clever sw guys (game industry veterans, et al) to 'do something amazing with it'. We know how that ended - Larrabee could run its own debugger, but at performance/watt at typical GPU tasks it could not compete with actual GPUs. So a CPU with tons of SIMD resources is not the panacea some saw in it.

I started my career with software rasterizers, so I'm not cold to GP-friendly SIMD arrays myself. Heck I fancy them. I even might actually think they're the future of graphics. But fat SIMD arrays have their own specific needs, which may not align well with the needs of a common CPU design, BW being a very apparent discrepancy, another being the lack of massive hw schedulers in the CPU (hyperthreading is barely scratching the surface of GPU schedulers). By putting such SIMD resources too deep in the CPU domain you inadvertently subject them to the 'ways of the CPU' - sw scheduling, small mem pools of big BW and large pools of not-so-big, perhaps abysmal (for the SIMDs) BW.
 

Absinthe

Member
Can't blame him when people are falling for obvious fakes like that email. Why would any customer service agent have any clue what is in the Wii U?

Or, maybe he is in CYA mode? We all have seen it before. Just deny (somewhat) what has been leaked to try and get back what was lost, but it is always too late.

I think that is the case here, as you can clearly see it by his 'attitude'.
 

Nezzhil

Member
I can't believe it was as easy as asking AMD.

Secondly, the Spanish guys that asked first were the real pioneers. Now everyone, ask AMD, Intel and IBM for the specs of the Durango and Orbis systems. XD
 

TyRaNtM

Neo Member
So, what we can safely guess:

CPU: 3 enhanced Broadway CPU Cores with increased cache
GPU: Modified E6760, possibly 576 GFLOPS, 480 shader processors, multiple display support (Wii U pad, obviously), 128-bit memory interface.
RAM: 2 GB of RAM of an unknown type (1 GB is currently reserved for the OS), as well as 32 MB of eDRAM.

Sounds good.
 

mrklaw

MrArseFace
Of course a CPU needs its 'private' SIMD units, if nothing else just for sporadic low-latency tasks. The tricky question, though, is: How much is enough? Are you sure you're spending your transistor budget wisely by placing those SIMD units with the CPU?

Surprisingly (or not) here's what Intel more or less did with Larrabee - they took some mature-design cores, slapped some advanced-design SIMD silicon on them (and by that I mean lots and lots of it), packed them all together on a coherent fat SMP infrastructure and gave that contraption to some very clever sw guys (game industry veterans, et al) to 'do something amazing with it'. We know how that ended - Larrabee could run its own debugger, but at performance/watt at typical GPU tasks it could not compete with actual GPUs. So a CPU with tons of SIMD resources is not the panacea some saw in it.

I started my career with software rasterizers, so I'm not cold to GP-friendly SIMD arrays myself. Heck I fancy them. I even might actually think they're the future of graphics. But fat SIMD arrays have their own specific needs, which may not align well with the needs of a common CPU design, BW being a very apparent discrepancy, another being the lack of massive hw schedulers in the CPU (hyperthreading is barely scratching the surface of GPU schedulers). By putting such SIMD resources too deep in the CPU domain you inadvertently subject them to the 'ways of the CPU' - sw scheduling, small mem pools of big BW and large pools of not-so-big, perhaps abysmal (for the SIMDs) BW.


It's all a balance I guess. If you have lots of GPGPU capacity, then aren't you sacrificing normal GPU capacity, since it's stealing space on the die? And what sort of thing do games programmers need to do that would benefit from GPGPU but wouldn't be suited for a multicore CPU to chew over?

I just think there is a risk that the system can become unbalanced with the GPU being asked to do too much
 

The_Lump

Banned
I said I'd lol and rofl if it turned out to be as easy as asking amd.


Lol.

Rofl.


Could have just asked BG a few months ago though : p
 

Absinthe

Member
You'd think they would just say "We cannot comment on rumors and speculation."

Well, remember that we did have 'a guy' claim back in July that he was a former ATI employee, and most of what he claimed he knew about the Wii U came to pass as of Sept 13th. Interestingly enough, he also said that the GPU was a modified e6760.... so, there is that little nugget from July to go along with the two press releases relating to an e6760, and now we have this AMD "leak".

All added up, I think it's pretty safe to say this just might be legit.
 

ozfunghi

Member
Well, remember that we did have 'a guy' claim back in July that he was a former ATI employee, and most of what he claimed he knew about the Wii U came to pass as of Sept 13th. Interestingly enough, he also said that the GPU was a modified e6760.... so, there is that little nugget from July to go along with the two press releases relating to an e6760, and now we have this AMD "leak".

All added up, I think it's pretty safe to say this just might be legit.


What press release?
 

Vinci

Danish
Someone write AMD a message and ask if a completely different card is in the Wii-U. If they're full of shit, they should tease that it is; if they're not, they'll correct you with the e6760.
 

ozfunghi

Member
Someone write AMD a message and ask if a completely different card is in the Wii-U. If they're full of shit, they should tease that it is; if they're not, they'll correct you with the e6760.

I was just about to suggest the same thing (wanted to post an hour ago but was too lazy). Someone ask AMD if the E2400 is in the WiiU and see what they say :)


OK, for a moment I thought you were talking about those two emails, lol.
 

roddur

Member
Well, remember that we did have 'a guy' claim back in July that he was a former ATI employee, and most of what he claimed he knew about the Wii U came to pass as of Sept 13th. Interestingly enough, he also said that the GPU was a modified e6760.... so, there is that little nugget from July to go along with the two press releases relating to an e6760, and now we have this AMD "leak".

All added up, I think it's pretty safe to say this just might be legit.

in that claim the guy said it's 824 MHz, but AMD's site says the E6760 is 800 MHz.
 

Azure J

Member
This email thing is just too janky to put any belief into imo.

why is a GPU chip being treated like National Security?

everyone knows SONY/MS will be way beyond this GPU why not share the info?

Could just be perception control. Look at how folks go on currently based on the notion of the base chip being old (R700 architecture). Now imagine if there was a hard-line answer of what it was, especially with MS and Sony poised to unleash AMD HD 7790/7850-based monsters in the next year or so.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
in that claim the guy said it's 824 MHz, but AMD's site says the E6760 is 800 MHz.

He also claimed that the amount of EDRAM on chip hadn't been finalized when he left, which I thought was quite a hilarious claim for a console that was 6 months from launch, i.e. Bullshit :p
 
This email thing is just too janky to put any belief into imo.



Could just be perception control. Look at how folks go on currently based on the notion of the base chip being old (R700 architecture). Now imagine if there was a hard-line answer of what it was, especially with MS and Sony poised to unleash AMD HD 7790/7850-based monsters in the next year or so.

someone is going to gut one open in a few months anyway
 

Absinthe

Member
He also claimed that the amount of EDRAM on chip hadn't been finalized when he left, which I thought was quite a hilarious claim for a console that was 6 months from launch, i.e. Bullshit :p

Maybe. Who really knows how long they may wait to finalize certain things? I don't think that point alone makes his comment BS. So far, his claims do not contradict anything yet. I wouldn't count it out as an option considering everything else.
 