
Rumor: Wii U final specs

TheD

The Detective
So is this the new argument from you now?

I think most everyone (who cares) knows general processing on the GPU has been around for a while. And if it's being put to actual usage in a console for once, then it does deserve acknowledgment.

I was just agreeing with what Thunder Monkey said: the leaked specs not calling it a GPGPU does not disprove them, because a modern AMD GPU can very safely be assumed to support GPGPU functions.

It has always been my argument that it supports GPGPU functions (I have no idea where you got the idea that it was not my argument).
Just that, unlike others, I know what that entails, what it can do well and what it cannot do well, and (unlike you) I know that the 360's Xenos also supports GPGPU functions (which have been used in games).
Thus I know that people who think it sets the WiiU apart and are pinning hopes on it making up for a slow CPU are wrong!
 
So is this the new argument from you now?

I think most everyone (who cares) knows general processing on the GPU has been around for a while. And if it's being put to actual usage in a console for once, then it does deserve acknowledgment.
Functionality is one facet, capability is another.

This is generally the sticking point.


I was just agreeing with what Thunder Monkey said: the leaked specs not calling it a GPGPU does not disprove them, because a modern AMD GPU can very safely be assumed to support GPGPU functions.

It has always been my argument that it supports GPGPU functions!
Just that, unlike others, I know what that entails, what it can do well and what it cannot do well, and (unlike you) I know that the 360's Xenos also supports GPGPU functions (which have been used in games).
Thus I know that people who think it sets the WiiU apart and are pinning hopes on it making up for a slow CPU are wrong!
Though you really need to stop acting like a dick, man.

bg isn't stupid. He knows that those functions have existed for years. He just has a rosier outlook on the overall capabilities of the GPU in there. I'm hesitant to say that it will matter much.
 

TheD

The Detective
Functionality is one facet, capability is another.

This is generally the sticking point.



Though you really need to stop acting like a dick, man.

bg isn't stupid. He knows that those functions have existed for years. He just has a rosier outlook on the overall capabilities of the GPU in there. I'm hesitant to say that it will matter much.


I am not the one acting like a dick.

BG attacked me for changing my argument, when I have done no such thing.

And he did just say that the WiiU is the only console that uses a GPGPU, which is just plain false, so I do not think he knows much about the topic.
 
I am not the one acting like a dick.

BG attacked me for changing my argument, when I have done no such thing.

And he did just say that the WiiU is the only console that uses a GPGPU, which is just plain false, so I do not think he knows much about the topic.

I think most everyone (who cares) knows general processing on the GPU has been around for a while. And if it's being put to actual usage in a console for once, then it does deserve acknowledgment.

That doesn't read like someone saying it's never been done.

That reads like someone saying:
And if it's being put to actual usage in a console for once, then it does deserve acknowledgment.
which sounds dickish when you say:
Just that, unlike others, I know what that entails, what it can do well and what it cannot do well, and (unlike you)

I'm not going to say bg gets a pass for changing your argument. But damn man, you need to learn to tone it down a notch or two. You can articulate a point without having to act like you're better, right?

I'm not sold that the brute power is there for it to really take over general processing, making up for the lack of FP brute on the CPU. Or different SIMD functionality. But being designed around the concept is a slight bit different than just having the functionality.
 
Espresso


Wii+U+CPU.jpg




Even if this guy was wrong about some things, he knew something, because there's no way his name just happened to be the same as the CPU's.

http://gamenmotion.blogspot.com/2012/06/rumor-wii-u-cpu-is-3-wii-cpu-cores.html



Edit: Now I would like an apology from everyone who called me crazy for believing in this person & treated me like I was a troll every time I brought this up.


imVQT.jpg
 

Durante

Member
But being designed around the concept is a slight bit different than just having the functionality.
I wonder to what extent it really is "designed around the concept" though. The single memory pool helps of course, but AMD GPUs of that era really weren't particularly good at GPGPU as soon as the workload wasn't 100% regular in both memory access patterns and control flow. Irregularity hurts performance on all GPUs, of course, but NV ones and GCN are quite a bit more resilient in that regard.
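To make the irregularity point concrete, here is a minimal plain-C sketch, only a model and not real GPU code, of why divergent control flow costs so much on a wide-SIMD part like R700: all 64 lanes of a wavefront run in lockstep, so when lanes disagree on a branch the hardware runs both paths back to back with inactive lanes masked off.

#include <stdio.h>

#define WAVEFRONT 64  /* R700 schedules threads in 64-wide wavefronts */

int main(void) {
    float x[WAVEFRONT], out[WAVEFRONT];

    for (int lane = 0; lane < WAVEFRONT; lane++)
        x[lane] = (float)lane;

    /* Divergent branch: even lanes take one path, odd lanes the other.
     * Lockstep hardware pays for BOTH passes, so effective throughput is
     * roughly halved even though each lane only does one operation. */
    for (int lane = 0; lane < WAVEFRONT; lane++)   /* pass 1: "then" path */
        if (lane % 2 == 0)
            out[lane] = x[lane] * 2.0f;            /* active lanes only */

    for (int lane = 0; lane < WAVEFRONT; lane++)   /* pass 2: "else" path */
        if (lane % 2 != 0)
            out[lane] = x[lane] + 1.0f;            /* the rest, one pass later */

    printf("out[0]=%.1f out[1]=%.1f\n", out[0], out[1]);
    return 0;
}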
 

TheD

The Detective
That doesn't read like someone saying it's never been done.

That reads like someone saying:
which sounds dickish when you say:


I'm not going to say bg gets a pass for changing your argument. But damn man, you need to learn to tone it down a notch or two. You can articulate a point without having to act like you're better, right?

I'm not sold that the brute power is there for it to really take over general processing, making up for the lack of FP brute on the CPU. Or different SIMD functionality. But being designed around the concept is a slight bit different.

I am not going to act overly nice to a guy who just had a go at me out of the blue.


Nothing we have seen shows the WiiU CPU being tightly integrated with the GPU acting as its SIMD unit; that is just something people have come up with in threads (AFAIK).

Even if it was, the R7xx arch is not crash hot for it; Nvidia has had a major lead in GPGPU up to GCN.
 
How good is the GPU meant to be in the Wii U? I'm so out of touch with this stuff nowadays.

Feature-wise it should be at least on the level of the 2008 R700 design from AMD, meaning Shader Model 4.1 and Direct3D 10.1 on PC. Not really modern, but still quite a bit better than what we have in the 360 and especially the PS3 (Xenos and RSX).
Features alone don't tell much about performance though. We still don't know anything specific. The GPU could be just on par with Xenos, or 2, 3 or 4 times faster.

But being designed around the concept is a slight bit different than just having the functionality.

We don't know that though. The mention of GPGPU capability by Iwata could just as well be marketing. It seems to have worked on NeoGAF; everybody is talking about it now.
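For scale, here is the "2, 3 or 4 times faster" range above as plain arithmetic, taking the commonly cited ~240 GFLOPS theoretical peak for Xenos (48 ALUs, each a vec4+scalar MAD per clock, at 500 MHz) as the baseline. Treat that baseline as an assumption: it's a paper peak, not a measurement.

#include <stdio.h>

int main(void) {
    /* Xenos paper peak: 48 ALUs x 10 FLOPs/clock (vec4 MAD + scalar MAD) x 0.5 GHz */
    const double xenos_gflops = 48 * 10 * 0.5;
    printf("Xenos baseline: %.0f GFLOPS\n", xenos_gflops);
    for (int mult = 2; mult <= 4; mult++)
        printf("%dx Xenos:      %.0f GFLOPS\n", mult, mult * xenos_gflops);
    return 0;
}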
 
Saying the WiiU was "designed around GPGPU" sounds a lot like an attempt at retconning the situation after the CPU began to appear weaker than what the 360 and PS3 have. I think there's a strong chance the PS4 and 720 literally are designed with that paradigm in mind, thanks to the huge leap AMD took with GPGPU performance in the GCN architecture and with their move towards Heterogeneous System Architecture. But with the WiiU, using R700-era GPU tech and the CPU and GPU unlikely to actually share the same memory space? GPGPU doesn't seem like a practical or efficient use of the resources, let alone a magic bullet for any CPU shortcomings.
 

wabo

Banned
Saying the WiiU was "designed around GPGPU" sounds a lot like an attempt at retconning the situation after the CPU began to appear weaker than what the 360 and PS3 have. I think there's a strong chance the PS4 and 720 literally are designed with that paradigm in mind, thanks to the huge leap AMD took with GPGPU performance in the GCN architecture and with their move towards Heterogeneous System Architecture. But with the WiiU, using R700-era GPU tech and the CPU and GPU unlikely to actually share the same memory space? GPGPU doesn't seem like a practical or efficient use of the resources, let alone a magic bullet for any CPU shortcomings.

PS3 did it the other way round, with the CPU in charge of "GPU" tasks. It's just a matter of "who" is going to make full use of the WiiU's capabilities when the twins are coming soon and bringing a true generational leap. Anyway, the devkits as far as we know are a big piece of smelly s**t, so I wouldn't expect any improvement due to good use of GPGPU at least till its second year.
 

beril

Member
Feature-wise it should be at least on the level of the 2008 R700 design from AMD, meaning Shader Model 4.1 and Direct3D 10.1 on PC. Not really modern, but still quite a bit better than what we have in the 360 and especially the PS3 (Xenos and RSX).
Features alone don't tell much about performance though. We still don't know anything specific. The GPU could be just on par with Xenos, or 2, 3 or 4 times faster.



We don't know that though. The mention of GPGPU capability by Iwata could just as well be marketing. It seems to have worked on NeoGAF; everybody is talking about it now.

It's the one thing we know about the GPU from a 100% confirmed source. It's also one of the only hardware specs the company has publicly announced in the last decade; I'd say that alone makes it likely to be a quite important feature.
 

magash

Member
It's the one thing we know about the GPU from a 100% confirmed source. It's also one of the only hardware specs the company has publicly announced in the last decade; I'd say that alone makes it likely to be a quite important feature.

Exactly. The mere fact that Nintendo actually bothered to mention the capability is something I find intriguing.
 
I wonder to what extent it really is "designed around the concept" though. The single memory pool helps of course, but AMD GPUs of that era really weren't particularly good at GPGPU as soon as the workload wasn't 100% regular in both memory access patterns and control flow. Irregularity hurts performance on all GPUs, of course, but NV ones and GCN are quite a bit more resilient in that regard.
I debated phrasing it that way, before settling on it.

Simple truth is... we don't really know. We know the underlying structure is based on the R700 series. But what changes or modifications were made to it is still a mystery. Given the mention of it, I can't see it being completely throwaway, but hell... this is Nintendo we're talking about. Their devs might still be amazed that physics can be calculated on the GPU, considering they've worked with a GPU that doesn't even do basic Dot3 for 12 years.
I am not going to act overly nice to a guy who just had a go at me out of the blue.

Nothing we have seen shows the WiiU CPU being tightly integrated with the GPU acting as its SIMD unit; that is just something people have come up with in threads (AFAIK).

Even if it was, the R7xx arch is not crash hot for it; Nvidia has had a major lead in GPGPU up to GCN.
Then act like an ass. Just don't be shocked that people are antagonistic towards you.
Feature-wise it should be at least on the level of the 2008 R700 design from AMD, meaning Shader Model 4.1 and Direct3D 10.1 on PC. Not really modern, but still quite a bit better than what we have in the 360 and especially the PS3 (Xenos and RSX).
Features alone don't tell much about performance though. We still don't know anything specific. The GPU could be just on par with Xenos, or 2, 3 or 4 times faster.

We don't know that though. The mention of GPGPU capability by Iwata could just as well be marketing. It seems to have worked on NeoGAF; everybody is talking about it now.
Hell, we don't even know its clock speed or stream processor count. At this point, saying it either way takes more than a little guessing.

I don't expect miracles. Honestly, I don't know that I expect anything. I just know that the design wasn't finalized until January of this year. The GPU might look completely different by now. Functionality added or removed.
 

Durante

Member
It's the one thing we know about the GPU from a 100% confirmed source. It's also one of the only hardware specs the company has publicly announced in the last decade; I'd say that alone makes it likely to be a quite important feature.
It's also one thing that is abundantly clear without any source at all. Unless they had decided to go with an overclocked Wii GPU, literally every sane choice they could make is capable of general purpose computation.
 
This reminds me:
Do we know anything about the console being region locked?

I know we all expect it to be, but we haven't heard anything from Nintendo thus far.
At least I didn't.

I really hope that Nintendo learned from the region lock fiasco that was the Wii and 3DS and makes Wii U region free, as it should be.
 
It's also one thing that is abundantly clear without any source at all. Unless they had decided to go with an overclocked Wii GPU, literally every sane choice they could make is capable of general purpose computation.

We should probably thank our lucky stars they didn't go with a DMP PICA GPU for the WiiU. ;)
 

Durante

Member
This reminds me:
Do we know anything about the console being region locked?

I know we all expect it to be, but we haven't heard anything from Nintendo thus far.
At least I didn't.

I really hope that Nintendo learned from the region lock fiasco that was the Wii and 3DS and makes Wii U region free, as it should be.
If Nintendo actually made it region free that would go a long way towards earning back my trust and respect. But they'll never do that, it's not in their corporate DNA.

Remember, this is the company that recently introduced region locking in one of their product lines.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
None of this:
Just that, unlike others, I know what that entails, what it can do well and what it cannot do well, and (unlike you) I know that the 360's Xenos also supports GPGPU functions (which have been used in games).

backs up this:

Thus I know that people who think it sets the WiiU apart and are pinning hopes on it making up for a slow CPU are wrong!

No GPGPU can 'make up' for a slow CPU in the general case. In specific cases, though, it can more than make up. Whether U-GPU has special provisions that tip the scales in favor of more efficient GPGPU and make it useful in more scenarios is unknown to us. Of course, if you know the details of the architecture, the thread local store config (which, apropos, was R700's main problem when doing GPGPU, and not the VLIW architecture that people like to blame), etc., feel free to enlighten everybody. Until then the 'GPGPU was done on Xenos!' slogan has as much information value as 'U-GPU is GPGPU'.
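To picture that general-vs-specific split, here is a rough plain-C sketch of the two workload shapes; the names are hypothetical and nothing here is WiiU-specific. Workload A is the regular, data-parallel case where a GPGPU can more than make up for a slow CPU; workload B is the branchy, pointer-chasing case where no GPU of that era helps.

#include <stdio.h>

#define N 1024

/* Workload A: maps well to GPGPU (think one GPU thread per element):
 * identical arithmetic per element, contiguous memory access. */
static void update_particles(float *pos, const float *vel, float dt) {
    for (int i = 0; i < N; i++)
        pos[i] += vel[i] * dt;
}

/* Workload B: serial, branch-heavy, irregular access; CPU territory. */
struct node { int value; struct node *next; };

static int walk_list(const struct node *n) {
    int sum = 0;
    while (n) {                 /* data-dependent loop length */
        if (n->value % 3)       /* unpredictable branch */
            sum += n->value;
        n = n->next;            /* pointer chase: irregular access */
    }
    return sum;
}

int main(void) {
    static float pos[N], vel[N];
    for (int i = 0; i < N; i++)
        vel[i] = 1.0f;
    update_particles(pos, vel, 0.016f);

    struct node b = { 2, NULL }, a = { 1, &b };
    printf("pos[0]=%.3f sum=%d\n", pos[0], walk_list(&a));
    return 0;
}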
 

Randdalf

Member
This probably isn't the right thread to ask this question, but there are so many Wii U threads right now that it's hard to find a good one for it.

31c28rbj1cL._AA300_.jpg


What is the flap on the front for? Gamecube ports? Or SD cards and USB ports?
 

magash

Member
This probably isn't the right thread to ask this question, but there are so many Wii U threads right now that it's hard to find a good one for it.

31c28rbj1cL._AA300_.jpg


What is the flap on the front for? Gamecube ports? Or SD cards and USB ports?

Maybe it is a small protrusion that is meant to stop people from trying to cover the vents on the side by tilting the console on one side, like the original Wii.
 

onQ123

Member
Feature-wise it should be at least on the level of the 2008 R700 design from AMD, meaning Shader Model 4.1 and Direct3D 10.1 on PC. Not really modern, but still quite a bit better than what we have in the 360 and especially the PS3 (Xenos and RSX).
Features alone don't tell much about performance though. We still don't know anything specific. The GPU could be just on par with Xenos, or 2, 3 or 4 times faster.



We don't know that though. The mention of GPGPU capability by Iwata could just as well be marketing. It seems to have worked on NeoGAF; everybody is talking about it now.

I was talking about it before Iwata said anything about a GPGPU being in the Wii U.


could Next Gen see 1 of the biggest improvements between console gens due to GPGPUs?

GPGPU Computing & why you should be more excited about Kinect 2 & the Next PS-Eye.
 

Ydahs

Member
Regarding GPGPU, the reason why it may be a benefit now more so than previously is that it's becoming more standard. Not sure about OpenGL, but DirectX has compute integrated within its API as of DirectX 11's DirectCompute (which also targets DX10-class hardware); that arrived in 2009. Engines developed for the 360 and PS3 probably didn't take advantage of GPGPU functionality because it wasn't programming practice at the time. It'd have made life a whole lot more difficult when cross-platform titles were being developed, since it might fragment the market if older versions of DirectX were abandoned.

Now, even though the WiiU isn't going to use DirectX or OpenGL, the engines developed will be built from the ground up based on common programming practices today. Developers will likely begin to allocate more 'general purpose' computations to the GPU, with the comfort of knowing that all modern graphics cards and all modern graphics APIs will support certain calls.

That's my theory anyway.
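As a sketch of what that practice might look like engine-side, here is a tiny C example; every name in it is hypothetical (no real WiiU or engine API is being described), and the GPU path is a stand-in for an actual compute dispatch.

#include <stdio.h>

typedef void (*job_fn)(float *data, int n);

/* CPU fallback path. */
static void simulate_on_cpu(float *data, int n) {
    for (int i = 0; i < n; i++)
        data[i] *= 0.99f;             /* e.g. damping a velocity field */
}

/* Stand-in for a compute dispatch; a real backend would record a GPU job. */
static void simulate_on_gpu(float *data, int n) {
    simulate_on_cpu(data, n);
    printf("(dispatched %d elements as a compute job)\n", n);
}

int main(void) {
    int platform_has_compute = 1;     /* assume queried from the platform */
    job_fn simulate = platform_has_compute ? simulate_on_gpu
                                           : simulate_on_cpu;
    float field[256] = { 1.0f };      /* rest zero-initialized */
    simulate(field, 256);
    printf("field[0]=%.4f\n", field[0]);
    return 0;
}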
 
I was just agreeing with what Thunder Monkey said: the leaked specs not calling it a GPGPU does not disprove them, because a modern AMD GPU can very safely be assumed to support GPGPU functions.

It has always been my argument that it supports GPGPU functions (I have no idea where you got the idea that it was not my argument).
Just that, unlike others, I know what that entails, what it can do well and what it cannot do well, and (unlike you) I know that the 360's Xenos also supports GPGPU functions (which have been used in games).
Thus I know that people who think it sets the WiiU apart and are pinning hopes on it making up for a slow CPU are wrong!

Could have sworn personal attacks were calling people names and whatnot. This is what I'm referring to.

http://www.neogaf.com/forum/showpost.php?p=40205173&postcount=3195

And this person would at least partially disagree, but that's also because he's being more flexible in the situation.

http://forum.beyond3d.com/showthread.php?p=1663966#post1663966

Functionality is one facet, capability is another.

This is generally the sticking point.

I agree. To me Nintendo isn't the type of company that would pursue this without addressing the known issues in some manner.

bg isn't stupid. He knows that those functions have existed for years. He just has a rosier outlook on the overall capabilities of the GPU in there. I'm hesitant to say that it will matter much.

Yep. I'm on the positive side in this till I have reason enough not to be.

I am not going to act overly nice to a guy who just had a go at me out of the blue.


Nothing we have seen shows the WiiU CPU being tightly integrated with the GPU acting as its SIMD unit; that is just something people have come up with in threads (AFAIK).

Even if it was, the R7xx arch is not crash hot for it; Nvidia has had a major lead in GPGPU up to GCN.

Actually the person I've gotten GPU details from was very complimentary of the bus between the CPU and GPU.

See, I don't have an issue with you, and I believe you know your stuff. What I have noticed though is that there is no "grey area" allowed when you post. It's "black and white", and when someone disagrees with you it becomes "extremely black and white". Like when it comes to the idea of customization. So when you don't accept the grey area, there's only yes or no coming from you, and no maybe at all so far.

Saying the WiiU was "designed around GPGPU" sounds a lot like an attempt at retconning the situation after the CPU began to appear weaker than what the 360 and PS3 have. I think there's a strong chance the PS4 and 720 literally are designed with that paradigm in mind, thanks to the huge leap AMD took with GPGPU performance in the GCN architecture and with their move towards Heterogeneous System Architecture. But with the WiiU, using R700-era GPU tech and the CPU and GPU unlikely to actually share the same memory space? GPGPU doesn't seem like a practical or efficient use of the resources, let alone a magic bullet for any CPU shortcomings.

When the original target specs mention it, it's not a retcon. It's understanding what was going on, especially when the first person to actually talk about relying on the GPU in that manner hasn't even shown himself to be a "Nintendo fan".

And if I understand what you are saying, then the Wii (GC) is an indication that the CPU and GPU share memory. If I remember correctly, blu told us the CPU could even access the embedded 1T-SRAM in the GPU. And they weren't designed to work in the manner we believe the WiiU's CPU/GPU to work. Hopefully he can clear that up, or someone else remembers better.
 
So what's the consensus now about the GPGPU and RAM? Haven't been in this thread since before the last ND.

I don't think it's changed. 1GB of RAM for devs, with unknown speed. A weak GPU compared to what's on the market, and much less than double the capability of Xenos/RSX. There's only but so much you can do with a 45/75W machine.
 

LCGeek

formerly sane
I don't think it's changed. 1GB of RAM for devs, with unknown speed. A weak GPU compared to what's on the market, and much less than double the capability of Xenos/RSX. There's only but so much you can do with a 45/75W machine.

I love the last line. If you think that at the price Nintendo is asking you were going to get top-of-the-line hardware, when such a part costs as much as or more than the system itself, that's ridiculous. Also, that's power, not capability or features, and the latter is something the WiiU clearly has an edge in over the other two.
 

beril

Member
I don't think it's changed. 1GB of RAM for devs, with unknown speed. A weak GPU compared to what's on the market, and much less than double the capability of Xenos/RSX. There's only but so much you can do with a 45/75W machine.

If there's one thing Nintendo never skimps on, it's RAM speed, so I wouldn't worry about that.
 

Rolf NB

Member
Btw, I've seen the 45/75W figures in the conference summary thread, but I'm missing some context (as in: why two numbers?). Anywhere I can find the exact statement? Is there some video stream I can watch, or a document I can download?
 
Btw, I've seen the 45/75W figures in the conference summary thread, but I'm missing some context (as in: why two numbers?). Anywhere I can find the exact statement? Is there some video stream I can watch, or a document I can download?

It was in the Japanese conference. The Japanese are very conscious of low-power-consumption products. Well, someone has to be.
 
Btw, I've seen the 45/75W figures in the conference summary thread, but I'm missing some context (as in: why two numbers?). Anywhere I can find the exact statement? Is there some video stream I can watch, or a document I can download?

Wii U draws up to 75W of power, but averages a draw of 40W.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Btw, I've seen the 45/75W figures in the conference summary thread, but I'm missing some context (as in: why two numbers?). Anywhere I can find the exact statement? Is there some video stream I can watch, or a document I can download?
That's what ND JP literally gave - two figures: 75W for the PSU (which, incidentally, we already knew) and 45W 'normal', 'usual', or whatever term you'd use for what I call nominal draw (which people who knew the first figure were already guessing around). My original guess was low 50's, IIRC.

What kind of power envelope are you expecting for 45/75W? Better than 14 GFLOPS/watt in the R700 line @ 40nm??
Where's that number coming from?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Don't think anything has been confirmed; however, officially the 3DS only supports SDHC but does in fact seem to support SDXC.
SDXC should be backward-compatible, IIRC. No idea what they do with any 'excess space' when in BC mode, though.
 
That's what ND JP literally gave - two figures: 75W for the PSU (which, incidentally, we already knew) and 45W 'normal', 'usual', or whatever term you'd use for what I call nominal draw (which people who knew the first figure were already guessing around). My original guess was low 50's, IIRC.


Where's that number coming from?

That number is a complete hypothetical based on the GFLOPS/TDP performance of the 700 line, easily shown here (I know... Wikipedia, blah blah blah, but I lost my original source from Ars Technica).

From the chart, the best performer is the 4770 (RV740 core), at 12 GFLOPS/watt: a TDP of 80W for a GFLOPS performance of 960.
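Spelling out the bound those two numbers give (and it is only an upper bound: the 45W figure covers the whole console rather than just the GPU, and it assumes best-case R700 efficiency carries over):

#include <stdio.h>

int main(void) {
    /* HD 4770 (RV740): ~960 GFLOPS peak at an 80W TDP */
    const double gflops_per_watt = 960.0 / 80.0;          /* = 12 */
    printf("best R700 efficiency: %.0f GFLOPS/W\n", gflops_per_watt);
    printf("at 45W nominal draw:  <= %.0f GFLOPS\n", 45.0 * gflops_per_watt);
    printf("at 75W PSU limit:     <= %.0f GFLOPS\n", 75.0 * gflops_per_watt);
    return 0;
}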
 

ozfunghi

Member
That number is a complete hypothetical based on the GFLOPS/TDP performance of the 700 line, easily shown here (I know... Wikipedia, blah blah blah, but I lost my original source from Ars Technica).

From the chart, the best performer is the 4770 (RV740 core), at 12 GFLOPS/watt: a TDP of 80W for a GFLOPS performance of 960.


So basically, what you consider to be the general consensus is some bad speculation you came up with on the spot. I think I speak for more people than just myself when I say that is not the general consensus.
 

Rolf NB

Member
So basically, what you consider to be the general consensus is some bad speculation you came up with on the spot. I think I speak for more people than just myself when I say that is not the general consensus.
It's an upper bound. A useful tool to make reasonable predictions.
 
Yes, confirmed by Iwata.


IBM reps keep saying Power7-based. "Leaks" point towards an upgraded Broadway. It is not clear what "upgraded" means in this context.

Official document via Nintendo's press site:

IBM Power Based Multicore CPU

Doesn't really tell us much, since Broadway is Power-based as well and there is no mention of which Power architecture... I'd say it's pretty safe to trust IBM's word on this, since multiple official sources have said it and I'd think they'd know best. That said, "enhanced Broadway" could easily mean Power7 since, as far as I know, Power-series processors are fully backwards compatible.
 