I'm pretty sure rumors have said that it is, by how much is the question.
Ahh ok. Thanks.
Yes but WILL IT HAS GOOD GRAPHIC?
I thought I'd chime in with a few of the things we know about the Wii U's hardware from the speculation threads (and by "know" I mean info which has been confirmed by multiple reliable sources).
CPU
The Wii U's CPU is a three-core, dual-threaded, out-of-order IBM Power ISA processor with 3MB of eDRAM L2 cache. Superficially it looks pretty similar to the Xenon CPU in the XBox 360, but it's a completely new CPU, and there are a number of important differences from Xenon:
- Firstly, it supports out-of-order execution. Roughly speaking, this means that the processor can alter the order it executes instructions to operate more efficiently. The benefit of this depends on the kind of code being run. Physics code, for example, wouldn't see much benefit from an out-of-order processor, whereas AI code should run significantly better. Out-of-order execution also generally improves the processor's ability to run poorly optimized code.
- Secondly, we have the larger cache (3MB vs 1MB). The Xenon's cache was actually pretty small for a processor running 6 threads at 3.2GHz, causing a lot of wasted cycles as threads wait for data to be fetched from main memory. The Wii U CPU's larger cache should mean code runs much more efficiently in comparison, particularly when combined with the out-of-order execution.
- The Xenon processor used the VMX128 AltiVec unit (or SIMD unit), which was a modified version of IBM's then-standard VMX unit, with more gaming-specific instructions. It appears that the Wii U's CPU will feature a highly customized AltiVec unit itself, possibly based off the newer VSX unit. This should substantially increase the efficiency of a lot of gaming-specific code, but the important thing is that, unlike the out-of-order execution and large cache, developers have to actively make use of the new AltiVec unit, and they have to really get to know how it operates to get the most out of it.
- The Wii U has a dedicated DSP for audio and a dedicated I/O processor. These relieve the CPU of a lot of work, for instance there are XBox 360 games which require an entire core to handle audio.
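To make the out-of-order point concrete, here's a toy Python simulation. It's purely illustrative: the latencies, the single-issue assumption and the idealised infinite scheduling window are made up, and it models neither console's actual CPU. The idea is just that a cache-missing load stalls an in-order core for its full latency, while an out-of-order core can slot independent work into the stall:

```python
# Toy model: each instruction is (latency_in_cycles, dependency_indices).
# A cache-missing load might take ~20 cycles; an add takes 1.

def in_order_cycles(instrs):
    """Issue one instruction per cycle, in program order,
    stalling until every dependency's result is ready."""
    done = {}  # instruction index -> cycle its result is ready
    cycle = 0
    for i, (latency, deps) in enumerate(instrs):
        start = max([cycle] + [done[d] for d in deps])
        done[i] = start + latency
        cycle = start + 1
    return max(done.values())

def out_of_order_cycles(instrs):
    """Each cycle, issue any one instruction whose dependencies
    are ready (single-issue, idealised infinite window)."""
    done = {}
    pending = list(range(len(instrs)))
    cycle = 0
    while pending:
        for i in pending:
            latency, deps = instrs[i]
            if all(done.get(d, float("inf")) <= cycle for d in deps):
                done[i] = cycle + latency
                pending.remove(i)
                break
        cycle += 1
    return max(done.values())

# A 20-cycle load, an add that depends on it, and 8 independent adds.
program = [(20, []), (1, [0])] + [(1, []) for _ in range(8)]
print(in_order_cycles(program))      # 29: everything waits behind the load
print(out_of_order_cycles(program))  # 21: independent adds fill the stall
```

The gap grows with how much independent work the code has nearby, which fits the intuition above: branchy AI-style code with lots of short independent operations benefits more than a physics loop that's already a carefully ordered stream of dependent math.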
The CPU should have quite a bit less raw power than the PS3's Cell, although the same will most likely be true for both the PS4 and next XBox. It will, however, be substantially easier to program for, and should be more effective at running a lot of code, for instance AI.
There aren't any reliable sources on the CPU's clock speed, but it's expected to be around 3.2GHz.
GPU
The GPU is likely to be VLIW-based, with a pretty modern feature-set and 32MB of eDRAM. We don't have any reliable numbers on either SPU count or clock speed, but in bullshit multiplier comparisons to Xenos (the XBox 360's GPU), most indications are that it's closer to 2 or 3 times the raw power of Xenos, as opposed to the 1.5 times quoted in the OP. There are a few things we do know about the GPU though:
- The 32MB of eDRAM is the only hard number we have about the GPU. This is more than three times the size of the eDRAM framebuffer on Xenos, and should allow games to achieve either 720p with 4x AA or 1080p with no AA, without having to do tiling (the need to tile AA'd HD images on the Xenos's framebuffer made its "free" AA a lot less free). It's also possible (although unconfirmed) that the eDRAM is on-die with the GPU, as opposed to on-chip (and hence on another die). If true, this means that the eDRAM will have much lower latency and possibly much higher bandwidth than the XBox 360's set-up. Developers will have to actively make use of the eDRAM to get the most out of it, though.
- The GPU features a tessellator. However, we have no idea whether it's a 4000-series tessellator (i.e. not very good) or perhaps a more modern 6000-series tessellator (a lot better). Again, developers would have to actively make use of this in their game engines.
- The GPU is heavily customized and features some unique functionality. Although we don't have any reliable indications of what sort of functionality Nintendo has focused on, it's been speculated that it's related to lighting. Apparently games which make good use of this functionality should see substantial improvements in performance. More than any other feature of the console, though, developers really need to put in the effort to optimize their engines for the GPU's customizations to get the most out of them.
- The GPU has a customized API, based on OpenGL. Regular OpenGL code should run, but won't run very well and won't make any use of the GPU's custom features. Developers will need a good understanding of the GPU's API to get the most out of it.
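As a sanity check on the 32MB figure, some quick framebuffer arithmetic bears out the 720p/4xAA and 1080p/no-AA claims. This assumes a 32-bit colour buffer plus a 32-bit depth/stencil buffer, with MSAA multiplying the storage for both (as on Xenos):

```python
# Framebuffer size in MB: colour + depth/stencil, scaled by MSAA samples.
def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=8):
    return width * height * bytes_per_pixel * msaa / (1024 * 1024)

print(framebuffer_mb(1280, 720, msaa=4))  # 28.125 -> fits in 32MB
print(framebuffer_mb(1920, 1080))         # ~15.8  -> fits in 32MB
# The same 720p/4xAA target blows well past Xenos' 10MB, hence tiling:
print(framebuffer_mb(1280, 720, msaa=4) > 10)  # True
```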
RAM
It seems the console will have either 1.5GB or 2GB of unified RAM, with indications that Nintendo were targeting 1.5GB with earlier dev-kits and later increased that to 2GB. We don't know the kind of RAM being used, but most expect DDR3, probably with a 128-bit interface and clock speed somewhere in the 750MHz to 1GHz range, resulting in a bandwidth somewhat, but not significantly, higher than the XBox 360 and PS3. It's worth noting that the large CPU cache and GPU eDRAM somewhat mitigate the need for very high bandwidths. It's possible, but quite unlikely, that they're using GDDR5, which would mean a much higher bandwidth.
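For a sense of what those rumoured numbers would mean, peak bandwidth is just bus width × transfers per clock × clock speed. DDR transfers twice per clock, and the clocks below are the rumoured range, not confirmed specs:

```python
def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock=2):
    # bytes per transfer * transfers per second, in GB/s
    return bus_bits / 8 * transfers_per_clock * clock_mhz * 1e6 / 1e9

print(bandwidth_gb_s(128, 750))   # 24.0 GB/s at the low end of the rumours
print(bandwidth_gb_s(128, 1000))  # 32.0 GB/s at the high end
print(bandwidth_gb_s(128, 700))   # 22.4 GB/s: XBox 360's main GDDR3 RAM
```

So even the low end edges out the XBox 360's main memory, which is where the "somewhat, but not significantly, higher" reading comes from.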
Going by what we know about the console's hardware, it should be able to produce games which noticeably out-perform what's available on XBox 360 and PS3, so long as everything's properly optimized. Of course, performance will still be far behind the PS4 and next XBox. What we're seeing at E3 is unlikely to be well optimized for a number of reasons:
- "Final" dev-kits, with actual production hardware, only started to arrive to developers a few weeks ago. This would be too late for the E3 demos to make any real use of any improvements this final hardware may have brought. We know that these dev-kits brought a slight improvement in performance, but we don't know if there were any changes in functionality (eg to the eDRAM, which could indicate why we're seeing so little AA).
- Nintendo don't seem to have locked down the clock speeds yet, which makes it difficult for developers to properly optimize games for the hardware. As Nintendo now has final production hardware to do thermal testing on, final clock speeds should come pretty soon.
- For third party multi-plats, the XBox360 and PS3 versions are going to sell the most (due to higher install-bases), so developers are going to put more resources towards those versions, and are likely to put the more talented team-members on XBox360 and PS3 development as well. Because they can get PS360-grade performance out of the Wii U with a quick, poorly optimized port, most aren't going to bother putting the time and money into substantially improving the Wii U version.
- We've only seen launch-window titles, and launch-window titles that are about five months from completion, at that. I can only think of a single case where a game for new hardware was actually well optimized at this point before the launch of the console (Rogue Leader for Gamecube).
- While third parties are unlikely to make good use of the hardware, Nintendo haven't shown any games from the first party studios most likely to really push the hardware (eg Retro, Monolith, EAD Tokyo, EAD Kyoto Group 3). These studios are the ones to watch for technically impressive games in the first couple of years of the Wii U's life.
Interestingly, the best-looking game that's been shown off thus far is probably Platinum's Project P-100. While people haven't been focusing on it from a technical perspective that much because of the art style, it's got great textures, good polygon detail, very nice lighting, good effects, a nice DoF effect, the IQ seems good and the framerate seems smooth. In some parts it also does nice 3D visuals on both the TV and controller screen. I wouldn't go so far as saying it looks substantially better than anything we've seen on PS360 (certainly not without seeing it in person), but it's definitely a nice looking game.
Here is a question: current Wii U (launch) games, including Nintendo's, all seem to be running at 720p, yet Reggie said this console could handle 1080p. But he tends to embellish a lot. He might have meant streaming movies at 1080p, but hoped the audience assumed games. So what I want to know is, if a developer is pretty far into development of a game targeting 720p, how difficult would it be for them to switch to 1080p? I'm just wondering if developers targeted 720p based on early dev kits, but newer kits might indicate future games in 1080p.
Reggie probably meant the console can scale to 1080p.
The system is unveiled, so there's not much to speculate about. Feel free to make a post reveal general thread though.
My intention wasn't to bar people from having a thread about the Wii U, but rather I assumed people would make a Wii U Information Thread or something after the conference that actually included all the new information.
Thanks for the post Thraktor. One of my main worries is the number of things within the Wii U that require developers to really learn them to get the best out of them. How many third-party developers are going to go through that effort to get stuff running well on the console? We saw what happened with the PS3, where the gulf between developers who sat down and learned the hardware (Naughty Dog etc.) and everyone else was pretty massive.
Second element of response should come from EA Sports games, soon enough.
Great post thraktor. You think it will be able to hit that UE4 minimum?
Well, paying Epic to optimise UE3 for WiiU specific builds would pretty much solve that problem for 90% of third party titles in a stroke.
Wait wait, the GC was more powerful than the PS2, nearly on the same level as Xbox, if not was at that level.
The console is definitely capable of producing graphics at 1080p (so were the PS3 and XBox 360, though). The thing is that most developers will target 720p instead because it frees up the GPU to do a lot more on screen, and most people don't have 1080p TVs anyway (and many that do wouldn't be able to tell the difference). We'll see a few 1080p games in the console's lifetime, but expect 720p for the most part.
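The cost of that choice is easy to put a number on: 1080p pushes 2.25 times as many pixels as 720p, so at the same framerate the GPU gets well under half the time per pixel.

```python
# Pixel counts: why 1080p costs so much more GPU time than 720p.
pixels_720p = 1280 * 720    # 921,600
pixels_1080p = 1920 * 1080  # 2,073,600
print(pixels_1080p / pixels_720p)  # 2.25
```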
Well, Naughty Dog didn't just sit down and learn the hardware, they're an incredibly technically talented team. I wish Factor 5 was still around at the moment, to be honest, as they were another team that could do amazing things with hardware, and a HD Rogue Squadron game by them for the Wii U could look amazing.
Anyway, I do think that third parties will get better use out of the hardware over time, but like the XBox 360 and PS3, the best looking games will be the exclusives.
That's quite clear, but I'm just wondering technically: can developers switch midstream from 720p to 1080p without much issue, or are they locked into a particular resolution once they start development?
*long and informative post*
Awesome post.
Think about how powerful the Wii was compared to the Xbox, PS2, Gamecube etc.
The Wii U will be of a similar magnitude more powerful when compared to the 360 and PS3.
Blimey, can't believe we're discussing this. Forgive me if someone else apart from my good self has already pointed this out but this article is nonsensical for the following reasons:
1) The U has a DSP; the 360 has 1 of its 6 total threads exclusively reserved for dealing with sound. That's 16 percent more power for a start, assuming that developers aren't going to set aside more threads to deal with sound (and most of them have double that: 2 out of 6 threads, a third of the 360's CPU power, faffing about with sound).
2) The U has a CPU that uses OoOE, the Xenon uses IOE meaning that the former will be more efficient than the latter.
3) The dev kits have 3GB of RAM and no OS, the OS uses a large amount of RAM (512MB reserved according to IdeaMan's sources) so we're looking at 2GB in total with 3 times more RAM available to developers.
4) Sources here have told us that the U's CPU has 3MB of eDRAM on-die compared to the 1MB that the Xenon has. The same sources have told us that the U's GPU has 32MB of eDRAM on-die compared to the 10MB that the Xenos has.
5) The U's GPU is at least 2 generations ahead of the RSX and Xenos and will have a fuller feature-set as a result.
I'm filing this under B for bollocks myself. On paper we're looking at a system around 3 times more powerful than the 360 but imo, in real-world performance terms, the OoOE CPU, GPU feature-set and eDRAM should push that to around 4 times more powerful.
Edit: Looks like I was beaten by Thraktor!
The WiiU version of Trine 2 is going to have better graphics than the 360/PS3 versions, according to the developers. So there's that.
But it's going to take a while for developers to build up their tools and get familiar with the system, whereas they've had a whole generation to do that with the 360/PS3. It's why I wouldn't be expecting most PS4/720 games to look that much better from the start. People easily forget what launch 360 games looked like.
So if this technical mumbo jumbo is correct, the Wii U can render the same amount of things on screen as the PS3 and 360, but has much more RAM for much higher quality textures that don't look as muddy?
That's very optimistic of both you and Thraktor. But we won't know for certain until we see the CPU and GPU clock rates, and how many FLOPS the GPU has. What we have seen so far is not looking good.
- Secondly, we have the larger cache (3MB vs 1MB). The Xenon's cache was actually pretty small for a processor running 6 threads at 3.2.GHz, causing a lot of wasted cycles as threads wait for data to be fetched from main memory. The Wii U CPU's larger cache should mean code runs much more efficiently in comparison, particularly when combined with the out-of-order execution.
- The Wii U has a dedicated DSP for audio and a dedicated I/O processor. These relieve the CPU of a lot of work, for instance there are XBox 360 games which require an entire core to handle audio.
The CPU should have quite a bit less raw power than the PS3's Cell, although the same will most likely be true for both the PS4 and next XBox. It will, however, be significantly easier to program for, and should be more effective at running a lot of code, for instance AI.
The GPU is likely to be VLIW-based, with a pretty modern feature-set and 32MB of eDRAM. We don't have any reliable numbers on either SPU count or clock speed, but in bullshit multiplier comparisons to the Xenos (XBox 360's CPU), most indications are that it's closer to 2 or 3 times the raw power of Xenos, as opposed to the 1.5 times quoted in the OP. There are a few things we do know about the GPU though:
- The GPU features a tesselator. However, we have no idea whether it's a 4000-series tesselator (ie not very good) or perhaps a more modern 6000-series tesselator (a lot better). Again, developers would have to actively make use of this in their game engines.
It seems the console will have either 1.5GB or 2GB of unified RAM, with indications that Nintendo were targeting 1.5GB with earlier dev-kits and later increased that to 2GB. We don't know the kind of RAM being used, but most expect DDR3, probably with a 128-bit interface and clock speed somewhere in the 750MHz to 1Ghz range, resulting in a bandwidth somewhat, but not significantly, higher than the XBox360 and PS3. It's worth noting that the large CPU cache and GPU eDRAM somewhat mitigate the need for very high bandwidths. It's possible, but quite unlikely, that they're using GDDR5, which would mean a much higher bandwidth.
Yeah, that's the 'hard' number coming from some dev sources, but again not official, and I'm finding it more and more difficult to believe, seeing how practically all games shown so far do not run at 1080p/no AA or 720p with 4xAA (they mostly have no AA, which is baffling if the eDRAM value is true).
All the eDRAM in the world doesn't make AA or 1080p free. It's also been speculated that the eDRAM isn't just a dedicated framebuffer like the 360 eDRAM.
That's very optimistic, both of you and Thraktor. But we won't know for certain until we see the CPU and GPU clock rates, and how many FLOPS the GPU can push. What we have seen so far is not looking good.
You have to remember that Nintendo themselves don't know how many FLOPS the GPU will push, or the clock speeds of any of the other components. The individual components were probably only finalised last month, and Nintendo will be tweaking clocks, power and cooling right up until a month or two before launch.
Developing for launch or launch window is a bloomin' nightmare, because developers and publishers end up with a constant stream of dev kit and SDK revisions going back and forth, with said revisions at this stage creating artifacts, resetting or even burning out dev kits if someone at Nintendo royally cocks up lol.
People are generally under the misapprehension that launch and launch window titles aren't up to snuff as far as 'looking next gen' goes because developers need time to get used to working with the hardware but the main cause of this is the fact that 90% of the development is done on underpowered dev kits.
If they are intending to launch this Winter, that kind of stuff has to be nailed down. They cannot leave it until 1 month before launch, else they run into massive supply issues. They have to get that done way before then so they can actually produce retail units.
Ahahahaaha MY SIDE! MY SIDE!!!
Holy shit Nintendo! As powerful as the PS3/360 almost a decade later? And the GPU is only slightly more powerful as well?
There are cell phones and tablets more powerful than this. This is hilarious.
Did you read those 2 big ass posts on this page? Cause it looks like you didn't...
Pretty sure MS was in a similar situation with regards to RAM on the 360. I don't think it was until the final year of production on the system that they finally decided to heed Epic and make it 512MB.
That being said... I do believe at this point the dev kits have been finalized, but several studios are still waiting on them.
It's common practice when launching a new console mate. Everything is done at the last minute.
I thought I'd chime in with a few of the things we know about the Wii U's hardware from the speculation threads (and by "know" I mean info which has been confirmed by multiple reliable sources).
CPU
The Wii U's CPU is a three-core, dual-threaded, out-of-order IBM Power ISA processor with 3MB of eDRAM L2 cache. Superficially it looks pretty similar to the Xenon CPU in the XBox 360, but it's a completely new CPU, and there are a number of important differences from Xenon:
- Firstly, it supports out-of-order execution. Roughly speaking, this means that the processor can alter the order it executes instructions to operate more efficiently. The benefit of this depends on the kind of code being run. Physics code, for example, wouldn't see much benefit from an out-of-order processor, whereas AI code should run significantly better. Out-of-order execution also generally improves the processor's ability to run poorly optimized code.
- Secondly, we have the larger cache (3MB vs 1MB). The Xenon's cache was actually pretty small for a processor running 6 threads at 3.2GHz, causing a lot of wasted cycles as threads wait for data to be fetched from main memory. The Wii U CPU's larger cache should mean code runs much more efficiently in comparison, particularly when combined with the out-of-order execution.
- The Xenon processor used the VMX128 AltiVec unit (or SIMD unit), which was a modified version of IBM's then-standard VMX unit, with more gaming-specific instructions. It appears that the Wii U's CPU will feature a highly customized AltiVec unit itself, possibly based off the newer VSX unit. This should substantially increase the efficiency of a lot of gaming-specific code, but the important thing is that, unlike the out-of-order execution and large cache, developers have to actively make use of the new AltiVec unit, and they have to really get to know how it operates to get the most out of it.
- The Wii U has a dedicated DSP for audio and a dedicated I/O processor. These relieve the CPU of a lot of work, for instance there are XBox 360 games which require an entire core to handle audio.
The CPU should have quite a bit less raw power than the PS3's Cell, although the same will most likely be true for both the PS4 and next XBox. It will, however, be significantly easier to program for, and should be more effective at running a lot of code, for instance AI.
There aren't any reliable sources on the CPU's clock speed, but it's expected to be around 3.2GHz.
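The out-of-order point above can be sketched with a toy scheduling model. This is purely illustrative Python (nothing to do with the real silicon, and all latencies are made up): it compares a simple in-order core, where a stalled instruction blocks everything behind it, against an idealised out-of-order core limited only by true data dependencies.

```python
# Toy model of in-order vs (idealised) out-of-order execution.
# An instruction is (name, latency_in_cycles, [dependencies]).

PROGRAM = [
    ("load", 20, []),        # long-latency load, e.g. a cache miss
    ("add1",  1, ["load"]),  # genuinely depends on the load
    ("mul1",  1, []),        # independent work the core *could* be doing
    ("mul2",  1, ["mul1"]),
    ("mul3",  1, ["mul2"]),
]

def in_order_cycles(program):
    """One instruction issues per cycle, strictly in program order:
    a stalled instruction also holds up everything behind it."""
    done, prev_issue, total = {}, -1, 0
    for name, latency, deps in program:
        ready = max((done[d] for d in deps), default=0)
        start = max(prev_issue + 1, ready)   # wait for operands AND our turn
        prev_issue = start
        done[name] = start + latency
        total = max(total, done[name])
    return total

def dataflow_cycles(program):
    """Idealised out-of-order core: only true data dependencies matter,
    so runtime is just the longest dependency chain."""
    done = {}
    for name, latency, deps in program:
        done[name] = max((done[d] for d in deps), default=0) + latency
    return max(done.values())

print(in_order_cycles(PROGRAM))   # 24: the muls queue up behind the stalled add
print(dataflow_cycles(PROGRAM))   # 21: the muls finish during the load's latency
```

On this toy trace the in-order core wastes most of the load's latency while the out-of-order core hides it behind independent work, which is essentially the AI-vs-physics point: branchy, cache-missy code has lots of stalls for the hardware to fill.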
GPU
The GPU is likely to be VLIW-based, with a pretty modern feature-set and 32MB of eDRAM. We don't have any reliable numbers on either SPU count or clock speed, but in bullshit multiplier comparisons to the Xenos (the XBox 360's GPU), most indications are that it's closer to 2 or 3 times the raw power of Xenos, as opposed to the 1.5 times quoted in the OP. There are a few things we do know about the GPU though:
- The 32MB of eDRAM is the only hard number we have about the GPU. This is more than three times the size of the eDRAM framebuffer on Xenos, and should allow games to achieve either 720p with 4x AA or 1080p with no AA, without having to do tiling (the need to tile AA'd HD images on the Xenos's framebuffer made its "free" AA a lot less free). It's also possible (although unconfirmed) that the eDRAM is on-die with the GPU, as opposed to on-package (and hence on another die). If true, this means that the eDRAM will have much lower latency and possibly much higher bandwidth than the XBox 360's set-up. Developers will have to actively make use of the eDRAM to get the most out of it, though.
- The GPU features a tesselator. However, we have no idea whether it's a 4000-series tesselator (ie not very good) or perhaps a more modern 6000-series tesselator (a lot better). Again, developers would have to actively make use of this in their game engines.
- The GPU is heavily customized and features some unique functionality. Although we don't have any reliable indications of what sort of functionality Nintendo has focused on, it's been speculated that it's related to lighting. Apparently games which make good use of this functionality should see substantial improvements in performance. More than any other feature of the console, though, developers really need to put in the effort to optimize their engines for the GPU's customizations to get the most out of them.
- The GPU has a customized API, based on OpenGL. Regular OpenGL code should run, but won't run very well and won't make any use of the GPU's custom features. Developers will need a good understanding of the GPU's API to get the most out of it.
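For what it's worth, the 720p/4xAA and 1080p/no-AA claims above fall straight out of back-of-the-envelope framebuffer arithmetic, assuming 32-bit colour plus 32-bit depth/stencil per sample (8 bytes total); that per-sample assumption is mine, not from any confirmed spec.

```python
# Framebuffer size = width * height * bytes_per_sample * AA samples.
# Assumes 4 bytes colour + 4 bytes depth/stencil per sample (my assumption).

def framebuffer_mb(width, height, aa_samples=1, bytes_per_sample=8):
    return width * height * bytes_per_sample * aa_samples / (1024 ** 2)

print(framebuffer_mb(1280, 720, aa_samples=4))  # ~28.1 MB: 720p 4xAA fits in 32MB
print(framebuffer_mb(1920, 1080))               # ~15.8 MB: 1080p no AA fits easily
print(framebuffer_mb(1280, 720, aa_samples=2))  # ~14.1 MB: already over Xenos's 10MB
```

The last line is why Xenos needed tiling even for 2xAA at 720p, and why 32MB would make those render targets possible without it.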
RAM
It seems the console will have either 1.5GB or 2GB of unified RAM, with indications that Nintendo were targeting 1.5GB with earlier dev-kits and later increased that to 2GB. We don't know the kind of RAM being used, but most expect DDR3, probably with a 128-bit interface and clock speed somewhere in the 750MHz to 1GHz range, resulting in a bandwidth somewhat, but not significantly, higher than the XBox360 and PS3. It's worth noting that the large CPU cache and GPU eDRAM somewhat mitigate the need for very high bandwidths. It's possible, but quite unlikely, that they're using GDDR5, which would mean a much higher bandwidth.
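The bandwidth comparison in that paragraph is simple arithmetic too. Assuming DDR3 (two transfers per clock) on the speculated 128-bit bus, theoretical peak bandwidth works out as below; these are peaks only, and the clocks are just the rumoured range, not confirmed figures.

```python
# Peak bandwidth (GB/s) = bus width in bytes * clock (MHz) * transfers per clock.
# DDR3 moves data twice per clock; GDDR5 would move four times per clock.

def peak_gb_s(bus_bits, clock_mhz, transfers_per_clock=2):
    return bus_bits / 8 * clock_mhz * transfers_per_clock / 1000

print(peak_gb_s(128, 750))                         # 24.0 GB/s at 750MHz
print(peak_gb_s(128, 1000))                        # 32.0 GB/s at 1GHz
print(peak_gb_s(128, 700))                         # 22.4 GB/s: 360's GDDR3, for reference
print(peak_gb_s(128, 750, transfers_per_clock=4))  # 48.0 GB/s if it were GDDR5 instead
```

Which matches the "somewhat, but not significantly, higher" framing: the XBox 360's main memory sits at 22.4 GB/s and the PS3's XDR at 25.6 GB/s.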
Going by what we know about the console's hardware, it should be able to produce games which noticeably out-perform what's available on XBox 360 and PS3, so long as everything's properly optimized. Of course, performance will still be far behind the PS4 and next XBox. What we're seeing at E3 is unlikely to be well optimized for a number of reasons:
- "Final" dev-kits, with actual production hardware, only started to arrive to developers a few weeks ago. This would be too late for the E3 demos to make any real use of any improvements this final hardware may have brought. We know that these dev-kits brought a slight improvement in performance, but we don't know if there were any changes in functionality (eg to the eDRAM, which could indicate why we're seeing so little AA).
- Nintendo don't seem to have locked down the clock speeds yet, which makes it difficult for developers to properly optimize games for the hardware. As Nintendo now has final production hardware to do thermal testing on, final clock speeds should come pretty soon.
- For third party multi-plats, the XBox360 and PS3 versions are going to sell the most (due to higher install-bases), so developers are going to put more resources towards those versions, and are likely to put the more talented team-members on XBox360 and PS3 development as well. Because they can get PS360-grade performance out of the Wii U with a quick, poorly optimized port, most aren't going to bother putting the time and money into substantially improving the Wii U version.
- We've only seen launch-window titles, and launch-window titles that are about five months from completion, at that. I can only think of a single case where a game for new hardware was actually well optimized at this point before the launch of the console (Rogue Leader for GameCube).
- While third parties are unlikely to make good use of the hardware, Nintendo haven't shown any games from the first party studios most likely to really push the hardware (eg Retro, Monolith, EAD Tokyo, EAD Kyoto Group 3). These studios are the ones to watch for technically impressive games in the first couple of years of the Wii U's life.
Interestingly, the best-looking game that's been shown off thus far is probably Platinum's Project P-100. While people haven't been focusing on it from a technical perspective that much because of the art style, it's got great textures, good polygon detail, very nice lighting, good effects, a nice DoF effect, the IQ seems good and the framerate seems smooth. In some parts it also does nice 3D visuals on both the TV and controller screen. I wouldn't go so far as saying it looks substantially better than anything we've seen on PS360 (certainly not without seeing it in person), but it's definitely a nice looking game.
Wow.
No, there isn't.