> The WiiU will NOT use GCN technology.
Based on what? Come on.
Okay, bye again.
There's plenty of decent info buried in this thread; you just have to filter out the rubbish.
Oh and no, GCN isn't realistic. I wouldn't count on VLIW4/DX11/OpenGL 4.0 level tech either.
If you want a console with a GCN based GPU then you should be looking towards Microsoft and Sony.
You can't discount anything with Nintendo.
Including the use of GCN.
They're going for small and lower power consumption.
GCN will have been out for a while by the time they need to go into production.
Not saying it's 100%. Just saying it is a possibility.
> You can't discount anything with Nintendo.
> Including the use of GCN.
> They're going for small and lower power consumption.
> GCN will have been out for a while by the time they need to go into production.
> Not saying it's 100%. Just saying it is a possibility.
At this point I'd say it isn't.
Not even Nintendo is that stupid.
> Wait, what would be stupid about that? 28nm yields?
I edited my post before you posted.
> It started off fairly reasonable - but now people are dragging it into a downwards spiral of insanity and nonsensical predictions that simply won't come true.
> The lack of any solid evidence over the past few months has brought out the crazies, it seems.
You're not going to get solid evidence, as people aren't going to risk their jobs. You just have to learn to read between the lines and work out which posters to trust.
> At this point I'd say it isn't.
Our in-the-knows have hinted at something around 2009 GPU standards. I doubt they'd say any different unless the kits are shooting for specs on that level.
> Our in-the-knows have hinted at something around 2009 GPU standards. I doubt they'd say any different unless the kits are shooting for specs on that level.
Don't be distracted by the fact the devkit has a RV770LE built in.
Will Nintendo have 28nm GPUs in their fall console release when AMD and Nvidia are having serious problems migrating to that process?
This is very intriguing to me.
There was nothing bleeding edge about the 3DS GPU. DMP had an ES 2.0 GPU available, yet Nintendo chose an older design.
Don't be distracted by the fact the devkit has a RV770LE built in.
Nintendo doesn't go out for a contract of 50+ million GPUs for a gaming console only to be served a watered-down chip from 2009. The chip they get will use the latest technology that is practically possible in the design and manufacturing process, and given that the Wii U will launch after GCN, that firmly puts it in that area. Compare the Xbox 360, which launched a full year before unified shaders came to PCs and still used a (rudimentary) version of that tech. NVIDIA released unified shaders on PC first; it in fact took ATI 18 months after the Xbox 360 to release their own unified shader GPU.
The idea that this is not the case would be ridiculed if it were suggested in a Sony or Microsoft thread. Yet Nintendo is somehow a flawed company that seems satisfied with 2009 tech because it's 'safe' and 'proven'. This is not how the gaming industry works, and Nintendo knows that. The GPUs Nintendo put in the N64, Cube and 3DS were all bleeding edge in their own ways, as that was the most competitive thing to do. Even if that were not the case, Nintendo will have seen that using last-generation technology hurt them with the Wii. It is not the 'safest' route to go by any means.
The fact that the RV770LE is absolutely unsuitable for a small console, even at 28nm, means there is a change coming. The final GPU will be comparable to the RV770LE only in terms of power and basic capabilities; otherwise it will be vastly different.
> Will Nintendo have 28nm GPUs in their fall console release when AMD and Nvidia are having serious problems migrating to that process?
> This is very intriguing to me.
I'm doubtful of this too. We're probably getting 40nm or 32nm for proper production volume. 45nm is out of the question for the GPU, though the CPU is announced to be on 45nm, produced by IBM.
> There was nothing bleeding edge about the 3DS GPU. DMP had an ES 2.0 GPU available, yet Nintendo chose an older design.
Fair enough. For home consoles, it has always been a different story, however.
> There was nothing bleeding edge about the 3DS GPU. DMP had an ES 2.0 GPU available, yet Nintendo chose an older design.
Because it's far more powerful and efficient. DMP considers SMAPH-S the lower-end solution, its only benefit being that it's more standards-conformant.
> Will Nintendo have 28nm GPUs in their fall console release when AMD and Nvidia are having serious problems migrating to that process?
> This is very intriguing to me.
No chance in hell. We'll see 45nm and nothing smaller.
Does DC stand for Damage Control?
Seriously though, I like your optimism but Nintendo will almost certainly want to skimp on the innards of the console to offset the price of the controller.
> Don't be distracted by the fact the devkit has a RV770LE built in. [...]
Never mind; safest bet, no 28nm for this GPU.
Nintendo would probably use NEC, not TSMC, so everyone else's issues are irrelevant. It's still not gonna happen, though. Honestly, I really wish this console were coming out in late 2013 instead...
Also, unveiling the console more than a year away from its release was really stupid.
Can someone explain to me why GCN is such a big deal? I know it's a new architecture and everyone is frothing over it to different extents, but what's the reason for it?
Also still sticking to this:
3.0GHz+ POWER7 based Tri/Quad Core CPU
1 - 1.5GB Main Memory (G/DDR3/5 is unknown)
32MB EDRAM "Mem1" Pool
640~800SPU AMD Graphics Card based on RV770(LE)
Anything further has been deemed false until deemed true. It's been my approach to guessing anything Nintendo this past generation and it's worked more times than it hasn't.
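For what it's worth, that guessed shader count translates into rough theoretical numbers like this. This is a back-of-the-envelope sketch only: the 575 MHz clock is my own assumption (borrowed from the RV770LE's stock clock), and the 2 ops/clock figure is the usual multiply-add convention, not leaked info:

```python
# Back-of-the-envelope theoretical shader throughput for the guessed range.
# GFLOPS = shader units x 2 ops/clock (multiply-add) x clock in GHz.
# 575 MHz is an assumed clock (the RV770LE's stock speed), not a leak.
def shader_gflops(spus: int, clock_mhz: float) -> float:
    return spus * 2 * clock_mhz / 1000.0

for spus in (640, 800):
    print(f"{spus} SPUs @ 575 MHz -> ~{shader_gflops(spus, 575):.0f} GFLOPS")
# -> ~736 and ~920 GFLOPS, i.e. very roughly 3-4x the Xbox 360's ~240 GFLOPS
```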
> Seriously though, I like your optimism but Nintendo will almost certainly want to skimp on the innards of the console to offset the price of the controller.
Well, they are skimping. Although they are making a console that is likely very powerful compared to the Xbox 360 and PS3, it won't be impressive in terms of raw power when it launches. It seems to be turning out cheap, but at least it's modern and reasonably fast within the constraints it has.
This console is going to do HD graphics, right?
Nah man, it all depends on how many Gamecubes you feed it.
> They had no choice due to all of the leaks.
This is a really interesting way to look at the events, BurntPork. Why do you think some more or less vague leaks forced their hand into doing a half-assed unveil?
The leaks did generate lots of speculation; I, as well as others, pieced together most of the functionality of the device, but it's not like anyone was certain.
> The leaks did generate lots of speculation; I, as well as others, pieced together most of the functionality of the device, but it's not like anyone was certain.
I was.
Just so you know, that wasn't my prediction. I was quoting that.
> I mean, if I'm not mistaken, the Darksiders folks said they got the game running on the Wii U in like 5 weeks, which would be pretty close to the timetable of the leak vs. the E3 presentation.
I saw that video too. From what I can remember, the Vigil guys broke their asses putting together a demo for E3, but at the last minute Nintendo decided not to show it.
> Who gives a shit about a botched unveil for a console that will be released a year from now?
A very easy question to answer: Nintendo's stock took a dive after that unveil, so they gave a shit all right, and so did investors.
> Don't be distracted by the fact the devkit has a RV770LE built in. [...]
Finally, someone pushing our expectations up, not down.
There are no commercial GPUs that are clocked at 1GHz stock. None. It's completely impossible. I'd make 700MHz the absolute limit, and even that's really pushing it.
> The Wii U will sport a souped-up Radeon 4000-series R700 GPU
> Sapphire was the first company to release a video card having clock speed of 1000 MHz (1 GHz) with the release of the Sapphire Atomic Edition HD 4890.
Ugh.
And if that happens I won't be upset if it fails and gets Iwata fired.
To be honest, my gut feeling is this will be hardly any different from this gen. Wii U will be 80-120% above current gen depending on the scenario. Meanwhile, MS and Sony will have consoles 5x as powerful as it, and Wii U will only get the most basic ports done by C-teams, and will probably get most games several months late as well, while in other cases it'll get the same crappy, low-budget spin-offs on last-gen engines the Wii was stuck with, if it gets anything at all. I just know that there's a tiny chance that I'm wrong, and I'm clinging to it. If I'm right, however, I WANT to be crushingly disappointed, so I'm keeping my hopes up.
Even IF your prediction on power is true (WiiU 1.8 to 2.2 times as powerful as the 360, and PS4/Xbox 3 at 9 to 11 times the 360), that still wouldn't be anywhere near the same hardware gap as last gen with Wii vs PS3/360 (the 360 is at least 10 times as powerful as the Wii).
My own prediction is that WiiU will be more like 3 times the 360 and the next Sony/MS consoles will be 2-3 times as powerful as WiiU.
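Quick arithmetic on those multipliers, just to make the gap comparison concrete. Every number here is one of the guesses from the posts above, not a confirmed spec:

```python
# Compare the predicted next-gen gap to last gen's Wii/360 gap.
# All multipliers are the posters' guesses, not confirmed figures.
wiiu_vs_360 = 3.0        # "more like 3 times 360"
nextgen_vs_360 = 10.0    # midpoint of the 9-11x guess
wii_gap_last_gen = 10.0  # "360 is at least 10 times as powerful as Wii"

nextgen_vs_wiiu = nextgen_vs_360 / wiiu_vs_360
print(f"Predicted next-gen vs Wii U gap: ~{nextgen_vs_wiiu:.1f}x "
      f"(last gen's 360 vs Wii gap: ~{wii_gap_last_gen:.0f}x)")
# -> ~3.3x, a much smaller gap than the ~10x of last gen
```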
> If you go by rumors: AMD took the Radeon HD 4890, back in 2009, and clocked it up to 1GHz. Since Nintendo is not sticking off-the-shelf products into their console, I say anything is possible.
First of all, we're not getting a 4890. Between yields (which is something that no amount of customization can fix) and power consumption, it can't happen. Not even Sony or Microsoft would aim for such a high clock. It's stupid, it's unreliable, it's hot, it's power hungry; it's not happening.
> Finally, someone pushing our expectations up, not down.
To be honest, I don't have really high expectations. GCN probably isn't more powerful than VLIW4 when it comes to gaming, and the 'power level' of the RV770LE is only good news when you compare it to the Xbox 360, not so good when you compare it to what's possible in the next Xbox.