
Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012


DCKing

Member
The WiiU will NOT use GCN technology.

Okay, bye again.
Based on what? Come on.

I'm not saying it does have it, but discounting it just because we're talking Nintendo is stupid. Both Flipper and Xenos used shading tech that was not available at the time of their release. ATI/AMD has done this both times it designed a new console GPU before.

@brain_stew: Good to see the mods were merciful :)
 
You can't discount anything with Nintendo.
Including the use of GCN.
They're going for small and lower power consumption.
GCN will have been out for a while by the time they need to go into production.

Not saying it's 100%. Just saying it is a possibility.
 

BurntPork

Banned
There's plenty of decent info buried in this thread, you just have to filter out the rubbish.

Oh and no, GCN isn't realistic. I wouldn't count on VLIW4/DX11/OpenGL 4.0 level tech either.

If you want a console with a GCN-based GPU, then you should be looking towards Microsoft and Sony.

I don't think we should count out DX11-level tech. I mean, Evergreen isn't that much different from R700. It's entirely realistic that some of Evergreen's features were added to the design. In fact, it would be far more shocking if that weren't the case.

You can't discount anything with Nintendo.
Including the use of GCN.
They're going for small and lower power consumption.
GCN will have been out for a while by the time they need to go into production.

Not saying it's 100%. Just saying it is a possibility.

8-10 months isn't "a while" in cases like this.
 
You can't discount anything with Nintendo.
Including the use of GCN.
They're going for small and lower power consumption.
GCN will have been out for a while by the time they need to go into production.

Not saying it's 100%. Just saying it is a possibility.
At this point I'd say it isn't.

Our in-the-knows have hinted at something around 2009 GPU standards. I doubt they'd say any different unless the kits are shooting for specs on that level.
 

DCKing

Member
ITT: People pretending GCN is some form of exotic magic. It's just another way to arrange the silicon, guys. It has been on AMD's drawing boards for years now, just like the Wii U GPU.
EDIT: @Xun: lol

The Wii U devkit is using an RV770LE. An obvious drop-in replacement for a GPU that is not ready. The suggestion that the GPU will be close to anything off-the-shelf AMD has made is ridiculous. The suggestion that it's close to off-the-shelf parts from two years ago even more so. It is in no way unlikely that the Wii U will have a GPU with the same raw power as an RV770LE made using the latest design AMD has. To be honest, that should be exactly what Nintendo ordered.
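For a sense of scale, here's a quick back-of-the-envelope on what "RV770LE raw power" means; the throughput formula is just SPs x 2 FLOPs per clock x clock, using stock HD 4830 figures:

```python
# Rough sketch: theoretical shader throughput of the RV770LE (HD 4830).
# 640 stream processors, 575 MHz core clock, 2 FLOPs (MADD) per SP per clock.
sps = 640
clock_ghz = 0.575
gflops = sps * 2 * clock_ghz
print(f"RV770LE: ~{gflops:.0f} GFLOPS")  # ~736 GFLOPS
# For comparison, the Xbox 360's Xenos is commonly pegged at ~240 GFLOPS.
```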
 
It started off fairly reasonable - but now people are dragging it into a downward spiral of insanity and nonsensical predictions that simply won't come true.

The lack of any solid evidence over the past few months has brought out the crazies, it seems.
You're not going to get solid evidence as people aren't going to risk their jobs. You just have to learn to read between the lines and work out which posters to trust.
 
At this point I'd say it isn't.

Our in-the-knows have hinted at something around 2009 GPU standards. I doubt they'd say any different unless the kits are shooting for specs on that level.

I'm not saying they'll be using a 7900 or something crazy.
Just that they could be going for a higher-yield GPU on a 28nm process. It would give them quite a bit of breathing room.
 

DCKing

Member
Our in-the-knows have hinted at something around 2009 GPU standards. I doubt they'd say any different unless the kits are shooting for specs on that level.
Don't be distracted by the fact that the devkit has an RV770LE built in.

Nintendo doesn't go out for a contract of 50+ million GPUs for a gaming console to be served a watered-down chip from 2009. The chip they will get will use the latest technology that is practically possible in the design and manufacturing process, and the fact that the Wii U will launch after GCN firmly puts it in that area. Compare the Xbox 360, which launched a full year before unified shaders came to PCs and still used a (rudimentary) version of that tech. It was NVIDIA who released unified shaders first, and it in fact took 18 months after the Xbox 360 for ATI to release their own unified shader GPU.

The idea that this is not the case would be ridiculed if it were suggested in a Sony or Microsoft thread. Yet Nintendo is somehow a flawed company that seems satisfied with 2009 tech, because it's 'safe' and 'proven'. This is not how the gaming industry works, and Nintendo knows that. The GPUs Nintendo put in the N64, Cube and 3DS were all bleeding edge in their own ways, as that was the most competitive thing to do. Even if that were not the case, Nintendo will have seen that using last-generation technology hurt them with the Wii. It is not the 'safest' route to go by any means.

The fact that the RV770LE is absolutely unsuitable for a small console, even at 28nm, means a change is coming. The final GPU will be comparable to the RV770LE only in terms of raw power and basic capabilities, but will be vastly different.
 
Will Nintendo have 28nm GPUs in their fall console release when AMD and Nvidia are having serious problems migrating to that process?

This is very intriguing to me.
 
Don't be distracted by the fact that the devkit has an RV770LE built in.

Nintendo doesn't go out for a contract of 50+ million GPUs for a gaming console to be served a watered-down chip from 2009. The chip they will get will use the latest technology that is practically possible in the design and manufacturing process, and the fact that the Wii U will launch after GCN firmly puts it in that area. Compare the Xbox 360, which launched a full year before unified shaders came to PCs and still used a (rudimentary) version of that tech. It was NVIDIA who released unified shaders first, and it in fact took 18 months after the Xbox 360 for ATI to release their own unified shader GPU.

The idea that this is not the case would be ridiculed if it were suggested in a Sony or Microsoft thread. Yet Nintendo is somehow a flawed company that seems satisfied with 2009 tech, because it's 'safe' and 'proven'. This is not how the gaming industry works, and Nintendo knows that. The GPUs Nintendo put in the N64, Cube and 3DS were all bleeding edge in their own ways, as that was the most competitive thing to do. Even if that were not the case, Nintendo will have seen that using last-generation technology hurt them with the Wii. It is not the 'safest' route to go by any means.

The fact that the RV770LE is absolutely unsuitable for a small console, even at 28nm, means a change is coming. The final GPU will be comparable to the RV770LE only in terms of raw power and basic capabilities, but will be vastly different.
There was nothing bleeding edge about the 3DS GPU. DMP had an ES 2.0 GPU available, yet Nintendo chose an older design.
 

DCKing

Member
Will Nintendo have 28nm GPUs in their fall console release when AMD and Nvidia are having serious problems migrating to that process?

This is very intriguing to me.
I'm doubtful of this too. We're probably getting 40nm or 32nm for proper production volume. 45nm is out of the question for the GPU, although the CPU is announced to be on 45nm, produced by IBM.

One thing you have to remember, though, is that AMD and NVIDIA are stuck with TSMC (a Taiwanese chip maker), and they have to produce a lower volume of larger chips. In principle, it's possible that Nintendo's usual chip maker, NEC, has 28nm facilities up and running. Given that NEC doesn't have the same commitments TSMC has, it could give Nintendo an exclusive production line focused just on the Wii U GPU. The added cost of the new process per GPU could also be significantly reduced: AMD orders no more than several hundred thousand huge 400+mm^2 chips, while Nintendo will order 8+ million (likely) 80-130mm^2 chips just for launch. In theory, the production process could be much better optimized for the Wii U GPU.

So yeah, it's possible. But I doubt it's going to happen, because it could leave Nintendo short on Wii U GPUs for launch.
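To put rough numbers on why die size matters so much on a new process, here's a crude sketch using 300mm wafers and a simple exponential defect-yield model; the defect density is a made-up illustrative figure, not a real TSMC/NEC number:

```python
import math

WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2  # 300mm wafer, ~70,686 mm^2

def good_dies_per_wafer(die_mm2, defects_per_cm2=0.4):
    """Crude estimate: gross dies (ignoring edge loss) times Poisson yield."""
    gross = WAFER_AREA_MM2 / die_mm2
    yield_frac = math.exp(-defects_per_cm2 * die_mm2 / 100.0)  # mm^2 -> cm^2
    return gross * yield_frac, yield_frac

for die in (400, 100):  # big desktop GPU vs. console-sized GPU
    good, y = good_dies_per_wafer(die)
    print(f"{die} mm^2 die: yield ~{y:.0%}, ~{good:.0f} good dies per wafer")
# 400 mm^2: ~20% yield, ~36 good dies; 100 mm^2: ~67% yield, ~474 good dies.
```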

There was nothing bleeding edge about the 3DS GPU. DMP had an ES 2.0 GPU available, yet Nintendo chose an older design.
Fair enough. For home consoles it has always been a different story, however.
 

wsippel

Banned
There was nothing bleeding edge about the 3DS GPU. DMP had an ES 2.0 GPU available, yet Nintendo chose an older design.
Because it's far more powerful and efficient. DMP considers the SMAPH-S the lower-end solution, with the only benefit being that it's more standards-compliant.
 

BurntPork

Banned
Will Nintendo have 28nm GPUs in their fall console release when AMD and Nvidia are having serious problems migrating to that process?

This is very intriguing to me.

Nintendo would probably use NEC, not TSMC, so everyone else's issues are irrelevant. It's still not gonna happen, though. Honestly, I really wish this console were coming out in late 2013 instead...

No chance in hell. We'll see 45nm and nothing smaller.

You mean 40nm. Or, at least I hope you do. :/
 

Snakeyes

Member
Does DC stand for Damage Control? :p

Seriously though, I like your optimism but Nintendo will almost certainly want to skimp on the innards of the console to offset the price of the controller.
 
Does DC stand for Damage Control? :p

Seriously though, I like your optimism but Nintendo will almost certainly want to skimp on the innards of the console to offset the price of the controller.

Nintendo will also want to create a system that can compete and still get third-party games...
They literally can't afford to be in the same situation the Wii was in, because they will not hit it big like the Wii did.
 

disap.ed

Member
Don't be distracted by the fact that the devkit has an RV770LE built in.

Nintendo doesn't go out for a contract of 50+ million GPUs for a gaming console to be served a watered-down chip from 2009. The chip they will get will use the latest technology that is practically possible in the design and manufacturing process, and the fact that the Wii U will launch after GCN firmly puts it in that area. Compare the Xbox 360, which launched a full year before unified shaders came to PCs and still used a (rudimentary) version of that tech. It was NVIDIA who released unified shaders first, and it in fact took 18 months after the Xbox 360 for ATI to release their own unified shader GPU.

The idea that this is not the case would be ridiculed if it were suggested in a Sony or Microsoft thread. Yet Nintendo is somehow a flawed company that seems satisfied with 2009 tech, because it's 'safe' and 'proven'. This is not how the gaming industry works, and Nintendo knows that. The GPUs Nintendo put in the N64, Cube and 3DS were all bleeding edge in their own ways, as that was the most competitive thing to do. Even if that were not the case, Nintendo will have seen that using last-generation technology hurt them with the Wii. It is not the 'safest' route to go by any means.

The fact that the RV770LE is absolutely unsuitable for a small console, even at 28nm, means a change is coming. The final GPU will be comparable to the RV770LE only in terms of raw power and basic capabilities, but will be vastly different.

Thx for that. I also think we will get something much more modern than the RV770 architecture, and because there won't be a 28nm shrink of the VLIW architecture, I actually see the possibility of GCN being used.
 
Nintendo would probably use NEC, not TSMC, so everyone else's issues are irrelevant. It's still not gonna happen, though. Honestly, I really wish this console were coming out in late 2013 instead...
Never mind; the safest bet is no 28nm for this GPU.

And yeah, I agree Nintendo should have cooked up this console for a while longer. I'm aware of their Wii situation, but I don't think they will capitalize enough on being maybe a year early.

Nintendo has the 3DS to worry about as well. Polishing the concept and releasing a more capable console at a later date seems better right now.

Also, unveiling the console more than a year away from its release was really stupid.
 

BurntPork

Banned
Does DC stand for Damage Control? :p

Seriously though, I like your optimism but Nintendo will almost certainly want to skimp on the innards of the console to offset the price of the controller.

And if that happens I won't be upset if it fails and gets Iwata fired.

To be honest, my gut tells me this will be hardly any different from this gen. Wii U will be 80-120% above current-gen depending on the scenario. Meanwhile, MS and Sony will have consoles that are 5x as powerful as it, and Wii U will only get the most basic ports done by C-teams, and will probably get most games several months late as well, while in other cases it'll get the same crappy, low-budget spin-offs on last-gen engines the Wii was stuck with, if it gets anything at all. I just know that there's a tiny chance that I'm wrong, and I'm clinging to it. If I'm right, however, I WANT to be crushingly disappointed, so I'm keeping my hopes up.

Also, unveiling the console more than a year away from its release was really stupid.

They had no choice due to all of the leaks.
 

Azure J

Member
Can someone explain to me why GCN is such a big deal? I know it's a new architecture and everyone is frothing over it to different extents, but what's the reason for it?

Also still sticking to this:

3.0GHz+ POWER7 based Tri/Quad Core CPU
1 - 1.5GB Main Memory (G/DDR3/5 is unknown)
32MB EDRAM "Mem1" Pool
640~800SPU AMD Graphics Card based on RV770(LE)

Anything further has been deemed false until deemed true. It's been my approach to guessing anything Nintendo this past generation and it's worked more times than it hasn't.
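On the "G/DDR3/5 is unknown" point: the memory type matters far more than the capacity, since peak bandwidth is just bus width times per-pin data rate. A quick sketch with assumed (purely illustrative) 128-bit configurations:

```python
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    """Peak memory bandwidth: bus width (bits) x per-pin data rate (Gbit/s)."""
    return bus_bits / 8 * gbps_per_pin

# Assumed 128-bit bus in both cases, with typical-for-the-era per-pin rates.
print(f"128-bit GDDR3 @ 1.6 Gbps/pin: {bandwidth_gb_s(128, 1.6):.1f} GB/s")  # 25.6
print(f"128-bit GDDR5 @ 4.0 Gbps/pin: {bandwidth_gb_s(128, 4.0):.1f} GB/s")  # 64.0
# A 32MB eDRAM 'Mem1' pool would help offset a slow main bus either way.
```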
 
Can someone explain to me why GCN is such a big deal? I know it's a new architecture and everyone is frothing over it to different extents, but what's the reason for it?

Also still sticking to this:

3.0GHz+ POWER7 based Tri/Quad Core CPU
1 - 1.5GB Main Memory (G/DDR3/5 is unknown)
32MB EDRAM "Mem1" Pool
640~800SPU AMD Graphics Card based on RV770(LE)

Anything further has been deemed false until deemed true. It's been my approach to guessing anything Nintendo this past generation and it's worked more times than it hasn't.

A smaller chipset means less power consumption and heat output.
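A rough illustration of why smaller helps: dynamic CMOS power scales roughly as P ≈ α·C·V²·f, so a process shrink that cuts capacitance and lets you drop voltage pays off quadratically. The numbers below are made up purely to show the scaling:

```python
def dynamic_power(c_rel, v, f_ghz, alpha=1.0):
    """Relative dynamic CMOS power: activity x capacitance x V^2 x frequency."""
    return alpha * c_rel * v**2 * f_ghz

old = dynamic_power(c_rel=1.0, v=1.2, f_ghz=0.6)  # hypothetical part on an old node
new = dynamic_power(c_rel=0.6, v=1.0, f_ghz=0.6)  # same design shrunk, lower voltage
print(f"Shrunk part draws ~{new / old:.0%} of the original power")  # ~42%
```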
 

PunchyBoy

Banned
Can someone explain to me why GCN is such a big deal? I know it's a new architecture and everyone is frothing over it to different extents, but what's the reason for it?

Also still sticking to this:

3.0GHz+ POWER7 based Tri/Quad Core CPU
1 - 1.5GB Main Memory (G/DDR3/5 is unknown)
32MB EDRAM "Mem1" Pool
640~800SPU AMD Graphics Card based on RV770(LE)

Anything further has been deemed false until deemed true. It's been my approach to guessing anything Nintendo this past generation and it's worked more times than it hasn't.

So, I'll stick with:
3.0GHz- POWER6-based Tri-Core CPU
768-1GB Main Memory (G/DDR3)
320~640SPU AMD Graphics based on RV730Pro/XT
 

DCKing

Member
Seriously though, I like your optimism but Nintendo will almost certainly want to skimp on the innards of the console to offset the price of the controller.
Well they are skimping :) Although they are making a console that is likely very powerful compared to the Xbox 360 and PS3, it won't be impressive in terms of raw power when it launches. It seems to be turning out cheap, but at least it's modern and reasonably fast within the constraints it has.
 
They had no choice due to all of the leaks.
This is a really interesting way to look at the events, BurntPork. Why do you think some more or less vague leaks forced their hand into a half-assed unveil?

The leaks did generate lots of speculation; I, as well as others, pieced together most of the functionality of the device, but it's not like anyone was certain.
 

Penguin

Member
This is a really interesting way to look at the events, BurntPork. Why do you think some more or less vague leaks forced their hand into a half-assed unveil?

The leaks did generate lots of speculation; I, as well as others, pieced together most of the functionality of the device, but it's not like anyone was certain.

I always assumed they intended for it to be more of a reveal of the controller, probably even without the sizzle reel, but once all the information leaked in April, they thought they had to present more.

I mean, if I'm not mistaken, the Darksiders folks said they got the game running on the Wii U in about five weeks, which would be pretty close to the timetable of the leak vs. the E3 presentation.

But just wild speculation.
 
This is a really interesting way to look at the events, BurntPork. Why do you think some more or less vague leaks forced their hand into a half-assed unveil?

The leaks did generate lots of speculation; I, as well as others, pieced together most of the functionality of the device, but it's not like anyone was certain.
I was.

Seemed necessary. Wii was on the outs, rumors were everywhere. Was a definite. At least to meself.
 

Door2Dawn

Banned
Who gives a shit about a botched unveil for a console that will be released a year from now? Once you get closer to the launch date, you can talk more about it and generate more hype. People won't even remember last year.
 

disap.ed

Member
My guess for the GPU:

8-12 CUs (512-768 SPs)
8-12 ROPs
32-48 TMUs

Clocks could be anything from ~500MHz to ~700MHz (depends on power budget)
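For scale, here's roughly what that guess works out to on paper, assuming GCN-style CUs at 64 SPs each, 2 FLOPs per SP per clock, and 1 pixel per ROP per clock:

```python
def gflops(sps, clock_ghz):
    return sps * 2 * clock_ghz  # 2 FLOPs (fused multiply-add) per SP per clock

def gpixels(rops, clock_ghz):
    return rops * clock_ghz     # 1 pixel per ROP per clock

low = (8 * 64, 8, 0.5)     # 8 CUs = 512 SPs, 8 ROPs, 500 MHz
high = (12 * 64, 12, 0.7)  # 12 CUs = 768 SPs, 12 ROPs, 700 MHz
for sps, rops, clk in (low, high):
    print(f"{sps} SPs @ {clk * 1000:.0f} MHz: "
          f"{gflops(sps, clk):.0f} GFLOPS, {gpixels(rops, clk):.1f} Gpix/s")
# Roughly 512 GFLOPS at the low end to ~1.1 TFLOPS at the high end.
```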
 
I mean, if I'm not mistaken, the Darksiders folks said they got the game running on the Wii U in about five weeks, which would be pretty close to the timetable of the leak vs. the E3 presentation.
I saw that video too; from what I can remember, the Vigil guys busted their asses putting together a demo for E3, but at the last minute Nintendo decided not to show it.
Who gives a shit about a botched unveil for a console that will be released a year from now?
A very easy question to answer. Nintendo's stock took a dive after that unveil, so they gave a shit all right, and so did investors.

It's a good idea to reveal your next platform as close to release as possible, since a portion of your existing or potential base tends to lose interest in your current offering. Plus, since the WiiU concept seems not as hot as the Wii one, they would have benefited massively from more "real" and polished software, among some other things. That's why a later unveil would have made a lot more sense.
 

AzaK

Member
Don't be distracted by the fact that the devkit has an RV770LE built in.

Nintendo doesn't go out for a contract of 50+ million GPUs for a gaming console to be served a watered-down chip from 2009. The chip they will get will use the latest technology that is practically possible in the design and manufacturing process, and the fact that the Wii U will launch after GCN firmly puts it in that area. Compare the Xbox 360, which launched a full year before unified shaders came to PCs and still used a (rudimentary) version of that tech. It was NVIDIA who released unified shaders first, and it in fact took 18 months after the Xbox 360 for ATI to release their own unified shader GPU.

The idea that this is not the case would be ridiculed if it were suggested in a Sony or Microsoft thread. Yet Nintendo is somehow a flawed company that seems satisfied with 2009 tech, because it's 'safe' and 'proven'. This is not how the gaming industry works, and Nintendo knows that. The GPUs Nintendo put in the N64, Cube and 3DS were all bleeding edge in their own ways, as that was the most competitive thing to do. Even if that were not the case, Nintendo will have seen that using last-generation technology hurt them with the Wii. It is not the 'safest' route to go by any means.

The fact that the RV770LE is absolutely unsuitable for a small console, even at 28nm, means a change is coming. The final GPU will be comparable to the RV770LE only in terms of raw power and basic capabilities, but will be vastly different.
Finally someone pushing our expectations up, not down ;)
 

MDX

Member
There are no commercial GPUs that are clocked at 1GHz stock. None. It's completely impossible. I'd make 700MHz the absolute limit, and even that's really pushing it.

If you go by rumors:

The Wii U will sport a souped-up Radeon 4000-series R700 GPU

AMD took the Radeon HD 4890, back in 2009, and clocked it up to 1GHz.

Sapphire was the first company to release a video card with a clock speed of 1000MHz (1GHz), with the Sapphire Atomic Edition HD 4890.

Since Nintendo is not sticking off-the-shelf products into their console, I say anything is possible.
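For what it's worth, here's what that part works out to on paper; the RV790 in the HD 4890 has 800 SPs, with the stock card at 850MHz and the Sapphire Atomic at 1GHz:

```python
sps = 800                      # stream processors on the HD 4890 (RV790)
for clock_ghz in (0.85, 1.0):  # stock clock vs. Sapphire Atomic factory overclock
    print(f"{clock_ghz * 1000:.0f} MHz: {sps * 2 * clock_ghz:.0f} GFLOPS")
# 850 MHz: 1360 GFLOPS; 1000 MHz: 1600 GFLOPS -- the first 1 GHz retail GPU.
```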
 

Donnie

Member
And if that happens I won't be upset if it fails and gets Iwata fired.

To be honest, my gut tells me this will be hardly any different from this gen. Wii U will be 80-120% above current-gen depending on the scenario. Meanwhile, MS and Sony will have consoles that are 5x as powerful as it, and Wii U will only get the most basic ports done by C-teams, and will probably get most games several months late as well, while in other cases it'll get the same crappy, low-budget spin-offs on last-gen engines the Wii was stuck with, if it gets anything at all. I just know that there's a tiny chance that I'm wrong, and I'm clinging to it. If I'm right, however, I WANT to be crushingly disappointed, so I'm keeping my hopes up.

But even IF your prediction on power is true (WiiU 1.8 to 2.2 times as powerful as the 360, and PS4/Xbox-3 9 to 11 times the 360), that still wouldn't be anywhere near the hardware gap of last gen with Wii vs PS3/360 (the 360 is at least 10 times as powerful as the Wii). Also, as people have said here in the past, Wii's old graphics architecture was the biggest reason it got so few of the big multiplatform games, not power. A game designed for a GCN DX11-based GPU isn't going to be difficult to get running on an R700 DX10.1-based GPU (if WiiU's GPU is really based on that). I mean, plenty of PC games will run on both; the same couldn't be said for games designed around DX9/10 and the DX7-level GPU in Wii.

My own prediction is that WiiU will be more like 3 times the 360, and the next Sony/MS consoles will be 2-3 times as powerful as WiiU.
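The arithmetic behind the two scenarios, for clarity (all figures are the posters' own rough multipliers, not measurements):

```python
# BurntPork's scenario: WiiU ~2x the 360, next-gen ~10x the 360.
wiiu, nextgen = 2.0, 10.0
print(f"Next-gen vs WiiU (BurntPork): ~{nextgen / wiiu:.0f}x")  # ~5x

# Donnie's scenario: WiiU ~3x the 360, next-gen 2-3x the WiiU.
wiiu = 3.0
for mult in (2.0, 3.0):
    print(f"Next-gen vs 360 (Donnie): ~{wiiu * mult:.0f}x")     # 6-9x
# Either way, a far smaller gap than Wii vs 360 (>=10x by Donnie's estimate).
```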
 

theBishop

Banned
Even IF your prediction on power is true (WiiU 1.8 to 2.2 times as powerful as the 360, and PS4/Xbox-3 9 to 11 times the 360), that still wouldn't be anywhere near the hardware gap of last gen with Wii vs PS3/360 (the 360 is at least 10 times as powerful as the Wii).

My own prediction is that WiiU will be more like 3 times the 360, and the next Sony/MS consoles will be 2-3 times as powerful as WiiU.

Putting aside the inanity of "X times as powerful", do you think WiiU will be playing PS3/360-caliber games that look a bit better and run a bit smoother, or do you think it will play Loop/PS4 games that look significantly worse?
 

BurntPork

Banned
If you go by rumors:



AMD took the Radeon HD 4890, back in 2009, and clocked it up to 1GHz.



Since Nintendo is not sticking off-the-shelf products into their console, I say anything is possible.
First of all, we're not getting a 4890. Between yields (which is something that no amount of customization can fix) and power consumption, it can't happen. Not even Sony or Microsoft would aim for such a high clock. It's stupid, it's unreliable, it's hot, it's power hungry; it's not happening.

Second, a factory overclock =/= stock clock. The stock 4890 is 850MHz. As a result, a 1GHz version would have MUCH lower yields. Unless the Wii U's size increases 3-4x and Nintendo has no problem launching with 10k units in each region, it's COMPLETELY impossible. Out the window. There's no debate here. You might as well argue that it'll have a 7970.
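To put rough numbers on the binning argument: if each die's maximum stable clock varies around some mean, the fraction qualifying for a 1GHz bin collapses fast. The distribution below is entirely made up for illustration:

```python
from math import erf, sqrt

def share_at_or_above(f_mhz, mean=900.0, sigma=60.0):
    """Fraction of dies whose max stable clock >= f_mhz, under a normal model."""
    z = (f_mhz - mean) / sigma
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

for f in (850, 1000):
    print(f"{f} MHz bin: ~{share_at_or_above(f):.0%} of dies qualify")
# ~80% of dies make an 850 MHz bin vs. ~5% at 1000 MHz -- "MUCH lower yields".
```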
 
You're not going to get solid evidence as people aren't going to risk their jobs. You just have to learn to read between the lines and work out which posters to trust.

Absolutely. Unless Nintendo or their partners announce anything themselves, we will never get credible information.
But it's when certain people go too far with rumours and speculation and formulate far-fetched predictions that we begin to get lost in explaining why they are wrong.

If we keep it focused and reasonable to start with, then we may actually get closer to guessing the system's power.
 

DCKing

Member
Finally someone pushing our expectations up, not down ;)
To be honest, I don't have really high expectations. GCN probably isn't more powerful than VLIW4 when it comes to gaming, and the 'power level' of the RV770LE is only good news if you compare it to the Xbox 360; it's not so good when you compare it to what's possible in the next Xbox.

Just because it will be underpowered compared to the competition doesn't mean it has to be old hardware, though. The reason I made that post is that I dislike how people think of the Nintendo hardware designers as idiots, and the fact that even some educated people in here still think the final hardware will have an R700, or new technology that somehow ignores tessellators, VLIW4/GCN, or anything else that has come out in the last three years.
 

Donnie

Member
I think at first most of its games will look a bit better than on 360/PS3, while some standout titles will look noticeably better (higher-res textures, better anti-aliasing, better lighting, better framerate). Once the new MS/Sony consoles come along, it can then get downgraded versions of those systems' games (a similar difference between the WiiU/PS4 versions as I'm predicting for 360/WiiU).

Unless of course it does insanely well, in which case we could then see a lot of WiiU up-ports to the new MS/Sony consoles, but that's unlikely.
 