Though I'm kind of scared that they won't show anything new half a year after E3, even if the Wii U will be at CES... unless they are keeping the "new" version of the Wii U for E3 or something.
And I like how it says it will release between THE START of E3 and the end of 2012.
I would be SHOCKED if they had anything playable. I expect the exact same presentation, if not LESS. Just kind of a reminder: "Hey, remember the Wii U? Cool, you'll see more at E3, here's the fucking bird demo again".
We are talking about die sizes. I know you know this. So what happens when you design for a smaller die size? You can do one of two things... get more power per watt, or use fewer watts for a given design. If Nintendo chose 40 nm, the GPU becomes more limited than at 28 nm. Either it will be much slower, or just a straight-up less sophisticated design (fewer transistors, etc.). At 28 nm, that leaves room to speculate that Nintendo is trying to squeeze a more powerful chip into a more limited case (since it will use less power)... or it may just use a conservative design that requires a small, silent fan. I'm hoping for the former.
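To put rough numbers on the node argument, here's a back-of-envelope sketch. All values are made up for illustration (not actual Wii U figures); the point is just that the same design at a smaller node, with less capacitance and a slightly lower voltage, lands in a noticeably lower power envelope:

```python
# Rough, illustrative numbers only -- NOT actual Wii U specs.
# Dynamic power scales roughly as P ~ C * V^2 * f; a shrink from
# 40 nm to 28 nm cuts area (and roughly capacitance) for the same
# transistor count, and often allows a lower voltage too.

def dynamic_power(capacitance_nf, voltage, freq_mhz):
    """Very rough dynamic power estimate in watts: P ~ C * V^2 * f."""
    return capacitance_nf * 1e-9 * voltage**2 * freq_mhz * 1e6

# Same hypothetical design on two nodes (made-up values):
p_40nm = dynamic_power(capacitance_nf=30, voltage=1.1, freq_mhz=500)
p_28nm = dynamic_power(capacitance_nf=20, voltage=1.0, freq_mhz=500)

print(f"40 nm: {p_40nm:.1f} W, 28 nm: {p_28nm:.1f} W")
# -> 40 nm: 18.2 W, 28 nm: 10.0 W
```

Same clocks, same design, very different thermals — which is exactly the headroom that would let Nintendo either push clocks/units up or keep the case small and quiet.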
BS. A ~$349 console with a blue laser drive, unified shader tech, wireless tablets and tons of EDRAM would be absolutely out of the question in 2006.
People should stop pretending like this is last-gen technology, just because it's not a powerhouse. Nintendo will do the best they can for the small size and low price.
What they are doing wrong is they're really fucking up the PR. This isn't panic mode Nintendo, this is panic mode GAF.
Yes, at $349 it would not have been feasible. But I'm willing to bet everything in the system was available in 2006. Nothing shocking about the GPU, CPU, RAM. Nothing that wasn't out then. And if they dropped the tablet controller, they may have released something very similar though at a slightly higher price point than $349 (well, blu-ray would have driven the price up more, but hey blu-ray doesn't affect raw performance of the chipset). Performance would be like an Xbox 360 on steroids.
Was wireless streaming technology there in 2006? Yes. Was blue laser disk tech there in 2006? Yes. Was unified shader tech there in 2006? Yes. Were high end POWER chips there in 2006? Yes. Hmm, it seems everything in the PS4 and Xbox Next was available in 2006 too. That sucks.
Also, I wasn't sure about 28nm, but with AMD apparently readying a $100 28nm graphics card already (Cape Verde is the codename, I believe), I'm moving towards the believers' camp.
I'm starting to lean towards wsippel's suggestion that AMD basically sold Nintendo a base design for the GPU back in 09 or whenever and haven't had much input since. Which process it ends up on will be up to NTD and NEC most likely.
Let me rephrase. The technology was there to recreate everything we saw at E3 from Nintendo, and anything else I expect to be shown at next year's E3 in a console with a reasonable price (sans controller, I am excluding that wonderful gimmick). I hope to be proven wrong. I do not believe the technology existed in 2006 to recreate what I expect from the next Xbox or Playstation.
Most likely IBM. As I wrote a while ago, an engineer at IBM's Entertainment Processor division was working on a "45nm CPU for a next generation gaming console" before starting to work on a 32nm GPU VLSI project. The CPU mentioned was almost certainly the Wii U CPU, and IBM usually keeps the console teams separated, so whatever GPU he was working on is most likely related as well.
Of course. But that's such a moot point. Everything was there in 2006 to recreate what the PS4 and next Xbox will have as well. It just doesn't count if you can't put it in a small box and sell it for a couple of hundred bucks.
The Wii U, with the exception of the controller, could have been created in 2006 in a case the size of the Xbox 360 for a few hundred dollars, in terms of the performance I'm expecting. It will be an Xbox 360 on steroids. The next-gen machines will be substantially more powerful than that. In 2006, it would have been impossible to replicate that in a case the size of the Xbox 360 and be under a thousand dollars. Yes, I am cheating a slight bit by nixing the controller; I know it's an important part of the Wii U, but the point is really about graphics performance. In terms of graphics performance, it's like the perfect 2006 console. Xbox 3 or PS4 will be true next-gen systems.
The GPU isn't a performance leader from '09. It's all about getting more efficiency from an even older architecture. Not trolling, I'm trying to lower everyone's expectations! If we get more, then it will be a bigger nerdgasm!
No. And I'm not sure any game uses it. The main application seem to be terrain engines, and if Skyrim for example uses that approach, they certainly don't seem to make the most of it. But they are heavily constrained by mass storage limitations as well.
'Everything we saw' is ultra subjective. If we, though, take your statement in the context of this thread's more reasonable expectations, said statement is factually untrue.
ATI's consumer line of R600 GPUs launched in 2007, and there was no GPU with more than 320 SPs up until Q2 2008.
DDR3 appeared in GPUs in mid-2008, and I'm not even touching on GDDR5 which was not available until late 2009.
POWER7 is 2010 tech, along with IBM's patented eDRAM tech.
You must have had some really deep pockets to get that level of performance back in 2006.
Ha! Fair enough. But remember things like the eDRAM and whatever other tweaks Nintendo/IBM have made to the design could make a huge difference. Architecture is fun for techies to discuss, but at this point I really only care about raw performance. GCN is a way for AMD to improve efficiency. I'm sure Nintendo have also improved efficiency in their own ways.
Edit:
@wsippel: Gotcha. Further thanks for your contributions.
They are using a high tech approach to get '06 numbers on the cheap. This is where the advanced technology is coming in. It won't even be half as powerful as the competitors. PROVE ME WRONG NINTENDO.
Neither did any of the parts of the iPad. I guess that performance was impossible to see back then. No way a game as good-looking as Infinity Blade was possible. The chips didn't exist then. Nope.
You're not only cheating by omitting the controller, but also by defining the Xbox 360 case as your starting point. Secondly, it would still be impossible. Let's assume the RV770LE is the benchmark for the Wii U's performance, and the final unit won't be any better. The RV770LE has fillrate numbers that more than double the 360's. It can push more than twice the number of pixels at any given moment, whether they are texels or something else, and although I can't find the RV770's vertex performance, you can be quite certain it demolishes the 360 in that area as well. Not enough to be a complete generational leap, but still much more powerful. Lastly, the power of 640 SPUs @ 575 MHz compared to 192 SPUs @ 500 MHz means that the projected Wii U GPU can do 3.8 times as many shader operations as the 360.
It's interesting to note that the Wii U's potential shading power exceeds that of 429 GameCubes duct-taped together.
You shouldn't underestimate what developers are using to build Wii U games.
Now, in 2006, ATI's very best offering was still the Radeon X1950XTX. It would take a full year (Nov 2007) for ATI to release a competitive unified shader GPU, which still had basically half the capacity of the RV770LE. The RV770LE has a chip TDP of probably 70W, exceeding that of the 360, but that's at a 55nm process node unavailable in 2006. If we're talking about putting that power in a console in 2006, we're talking about a 100+W chip based on a design that simply didn't exist. You could argue that the 2006 Nvidia GeForce 8800GTX has enough power to match the RV770LE, but that one cost $400+ and tapped 155W as a PC card. And it was still somewhat less powerful, and did not implement all DX10.1 features.
In other words: the GPU power of the Wii U would be impossible in a 2006 console. The first GPU that would be able to deliver the same performance as the Wii U GPU in an Xbox-sized console is the RV740 chip, released in 2009. That could deliver RV770LE-like performance for some ~50W. However, you can bet that the Wii U uses modern tessellation and more modern shaders, as opposed to both the RV770LE and RV740. So you're looking in the direction of the Juniper Pro chip. The first point at which I think a console with the GPU of the Wii U could be possible in an Xbox 360 case would be early 2010. It would be hella expensive, and RRoD all the time, but it could happen.
And we're omitting a potentially POWER7 related CPU, the massive amounts of eDRAM that would be huge on 2006 processes, a blue laser drive and voodoo wireless display tech. And we're not even talking about a Wii U sized box. Nintendo is truly being cutting edge here, it's just not the cutting edge we might like best.
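A quick sanity-check of the shader-ops figure above. These are the thread's speculated numbers (RV770LE-like part for the Wii U, the 192 SPU @ 500 MHz figure used for Xenos), not confirmed specs:

```python
# Back-of-envelope check of the "3.8x the 360" shader-ops claim.
# All figures are the thread's speculation, NOT confirmed Wii U specs.
wiiu_sps, wiiu_mhz = 640, 575   # rumored RV770LE-like GPU
x360_sps, x360_mhz = 192, 500   # figure used in the post for Xenos

ratio = (wiiu_sps * wiiu_mhz) / (x360_sps * x360_mhz)
print(f"Projected shader throughput: {ratio:.1f}x the 360")
# -> Projected shader throughput: 3.8x the 360
```

So the 3.8x figure does follow directly from those assumed unit counts and clocks.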
Ultimately, I expect games that look an awful lot like Xbox 360 games. Sure, higher resolution and IQ that the Xbox 360 can't handle...but still something that looks like an Xbox 360 on roids. I do not think a 2 or 3x jump will be enough to say "holy sheeit impossibru on Xbox 360". An '06 system on roids. I'm not saying you are wrong. These are my expectations and I want to be proven wrong. I WANT TO BE.
There will be games that clearly aren't possible on the 360. Hell, there are Wii games that wouldn't be possible on the GC, though it's not terribly noticeable.
That's nonsense. My laptop has a GPU (it's a Turks chip) less powerful than the RV770LE and I can play Skyrim at 'high quality' settings on 1080p (with a much better framerate than the PS3 version). Trust me that it looks a lot better than what you see on the PS3 and 360. And that's a GPU that has 3/4 the shaders and 1/2 the ROPs/TMUs of the RV770LE. Even the RV730 we thought was in the Wii U before can play CoD 4 at 1080p where the 360 could only do 600p. Now consider that these games were not even tailor-made for these GPUs, as opposed to the 360 and PS3 versions.
Modern stuff is just better. If we see bland ports, that will be developer laziness, not a lack of power. The only thing we should truly worry about is the gap between the Wii U and the PS4 / Xbox Next, but not even that gap will stop it from having ports.
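For anyone who thinks 1080p vs 600p is a small difference, the pixel math says otherwise (using ~1024x600 for the 360's sub-HD render target — exact sub-HD resolutions varied per game, so treat this as illustrative):

```python
# Pixel-count math behind the "1080p vs 600p" point above.
# 1024x600 is the commonly cited sub-HD render target; exact
# resolutions varied per game, so this is illustrative only.
full_hd = 1920 * 1080   # 2,073,600 pixels
sub_hd  = 1024 * 600    #   614,400 pixels

print(f"1080p pushes {full_hd / sub_hd:.3f}x the pixels of ~600p")
# -> 1080p pushes 3.375x the pixels of ~600p
```

That's over triple the fillrate demand at the same framerate, before you even touch better shaders or AA.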
There will be a slew of generic looking XBOX360-esque ports and titles at first release, but leave it to Nintendo and their trusted parties to bring the real eye candy.
They're best at pushing the hardware - think Galaxy, Metroid etc.
Now apply that approach to the Wii U - even if it were only as powerful as a 360 - and you're getting Uncharted 3 level graphics for Nintendo's first-party games.
Now add on top of that the fact we know the system is MORE powerful... and the difference will be, ultimately, huge - especially when compared to the power of the Wii.
I play Skyrim on all-high settings, and have seen Skyrim on 360 in person and can play it right now if I wish, but I've seen enough. It's not enough of a difference. I doubt we'll see anything that makes the 360 or PS3 look laughably out of date... like most generational transitions do.
Well, if you keep raising the bar like this, we'll agree at some point.
This 'laughably out of date' is subjective. If you think high-end PC games make PS3/360 games look 'laughably out of date' then it's likely you will see that on Wii U, considering an optimized game for the system (especially with 2 GB RAM, just do it Nintendo). If you're thinking PS2 -> PS3 jump, then no, that's not going to happen.
I've made the comparison before: I think the Wii U will compare to the Xbox 360 and PS3 like the GameCube compared to the Dreamcast. Not a true generational leap, but there's clearly more raw power involved and many more graphical effects are possible.
As crazy as it may seem, a surprise "The console's in stores tomorrow!" isn't completely impossible. The oft-quoted Saturn debacle was largely a last-minute decision to get out ahead of the PlayStation, which meant there were barely any games available for months after launch, so the early release was seen as a failure. Nintendo, however, seem to have been targeting a mid-2012 launch for quite a while, so they shouldn't have a problem making sure there's software there on and after launch day. They also haven't shown a single first-party title for the console yet, which would give them the opportunity for an Apple-style strategy of turning hype into sales immediately, rather than letting the hype die down, or become diluted by Xbox 3 or PS4 announcements, before launch.
Of course, for that to work, they'd really need the big games out at launch, including both casual and dudebro-style heavy hitters. A look at the teams that would have been free for the necessary 18-24 month timescale suggests a Mario game (likely NSMB, but possibly 3D; teams for both have been free for a while) and a Retro game (hopefully a graphical showcase FPS with a substantial online component) could be possibilities, along with the usual assortment of 3rd party titles.
Of course, I don't expect this to actually happen, but it's interesting to consider as a possibility.
The claim I make is that the performance I expect from the Wii U, pure performance, is what I would expect from a "perfect" '06 generation machine. It will provide nothing more. I don't care that the specific chips did not exist to make an identical unit for the identical price it will come out at. Yes, it is using technology which is far more advanced than the previous generation, just so it can be a step up and not a leap beyond them. It's being used for greater efficiency. The chips for the iPad didn't exist in '06, but there was technology that could replicate its graphics just fine. The chips for the Wii U did not exist in '06, but I think that level of technology could have been feasibly replicated. I expect nothing more than a beefy, roided-up '06 machine (the 360 was '05, sure, but hey, the PS3 came out in '06 and could have been more powerful if they made different decisions about its innards).
If what you're saying in the prev post about Xenon 360 w/ ~60-70w TDP for GPU is correct, then how much gap could there really be between a theoretical WiiU GPU and whatever Xbox/PS4 releases? TSMC's timeline for 20nm chips is late 2013-2014 so depending on high-performance chips from that process seems unrealistic (unless they plan on giving WiiU 2 years of head start), and minus that we're only looking at a few years' advancement in design. That'd probably mean a decent bump in performance from streamlined design and probably a higher-powered chip, and a bag of new tricks from DX11. If 360/PS3 -> WiiU is 'not that huge' it would seem WiiU -> PS4/Xbox would be 'not that huge' either. In either case you wouldn't be anywhere near the performance of Crossfire/SLI 300W monster graphics cards, which seems to be what people are expecting now out of the incoming 'next gen.'
Care to read the rest of my post? As I said, the Saturn launch was a disaster because they hadn't actually planned to have it out that early. Nintendo, by contrast, have been targeting a mid-2012 launch for a while, and so would potentially have the chance to do it right. Consider it pulling an Apple, rather than a Saturn.
Hmm. Can the same be said nowadays? I think Nintendo releasing early is an indication they believe they are now in the same ocean and they need a head start in this yacht race.