
CPU: Wii U just as powerful as PS3, X360; GPU 1.5 times stronger

I thought I'd chime in with a few of the things we know about the Wii U's hardware from the speculation threads (and by "know" I mean info which has been confirmed by multiple reliable sources).

CPU

The Wii U's CPU is a three-core, dual-threaded, out-of-order IBM Power ISA processor with 3MB of eDRAM L2 cache. Superficially it looks pretty similar to the Xenon CPU in the XBox 360, but it's a completely new CPU, and there are a number of important differences from Xenon:

- Firstly, it supports out-of-order execution. Roughly speaking, this means that the processor can alter the order in which it executes instructions to operate more efficiently. The benefit of this depends on the kind of code being run. Physics code, for example, wouldn't see much benefit from an out-of-order processor, whereas AI code should run significantly better. Out-of-order execution also generally improves the processor's ability to run poorly optimized code (a toy illustration follows after this list).

- Secondly, we have the larger cache (3MB vs 1MB). The Xenon's cache was actually pretty small for a processor running 6 threads at 3.2GHz, causing a lot of wasted cycles as threads wait for data to be fetched from main memory. The Wii U CPU's larger cache should mean code runs much more efficiently in comparison, particularly when combined with the out-of-order execution.

- The Xenon processor used the VMX128 AltiVec unit (or SIMD unit), which was a modified version of IBM's then-standard VMX unit, with more gaming-specific instructions. It appears that the Wii U's CPU will feature a highly customized AltiVec unit itself, possibly based on the newer VSX unit. This should substantially increase the efficiency of a lot of gaming-specific code, but the important thing is that, unlike the out-of-order execution and large cache, developers have to actively make use of the new AltiVec unit, and they have to really get to know how it operates to get the most out of it (the second sketch after this list shows the flavour of standard AltiVec code).

- The Wii U has a dedicated DSP for audio and a dedicated I/O processor. These relieve the CPU of a lot of work; there are XBox 360 games, for instance, which require an entire core just to handle audio.
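
To illustrate the out-of-order point from the first bullet, here's a toy C sketch (my own illustration, not Wii U code) of the pointer-chasing pattern that's common in AI code. On an in-order core like Xenon, a cache miss on the pointer chase stalls everything behind it; an out-of-order core can keep the independent accumulator work in flight while the load is outstanding.

Code:
#include <stddef.h>

/* Toy AI-ish workload: walk a linked list (pointer chase, likely cache
 * misses) while also accumulating from a flat array (independent work
 * an out-of-order core can execute past a stalled load). */
struct node {
    int cost;
    struct node *next;
};

int path_cost(const struct node *head, const int *weights, size_t n)
{
    int total = 0, wsum = 0;
    size_t i = 0;

    while (head != NULL) {
        total += head->cost;      /* depends on the (possibly missing) load */
        if (i < n)
            wsum += weights[i++]; /* independent: OoO can run this early */
        head = head->next;
    }
    return total + wsum;
}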
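
And to illustrate the AltiVec bullet: the Wii U's customized SIMD unit is undocumented, so as a stand-in here's what plain VMX/AltiVec code looks like using the standard intrinsics. scale_add is a made-up helper; it assumes 16-byte-aligned arrays, n divisible by 4, and a compiler flag like gcc's -maltivec.

Code:
#include <altivec.h>

/* out[i] = a[i] * s + b[i], four floats per iteration. */
void scale_add(float *out, const float *a, const float *b, float s, int n)
{
    float sv[4] __attribute__((aligned(16))) = { s, s, s, s };
    vector float vs = vec_ld(0, sv);            /* broadcast the scalar */

    for (int i = 0; i < n; i += 4) {
        vector float va = vec_ld(0, &a[i]);     /* load 4 floats */
        vector float vb = vec_ld(0, &b[i]);
        vector float vr = vec_madd(va, vs, vb); /* fused multiply-add */
        vec_st(vr, 0, &out[i]);                 /* store 4 floats */
    }
}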

The CPU should have quite a bit less raw power than the PS3's Cell, although the same will most likely be true for both the PS4 and next XBox. It will, however, be substantially easier to program for, and should be more effective at running a lot of code, for instance AI.

There aren't any reliable sources on the CPU's clock speed, but it's expected to be around 3.2GHz.

GPU

The GPU is likely to be VLIW-based, with a pretty modern feature-set and 32MB of eDRAM. We don't have any reliable numbers on either SPU count or clock speed, but in bullshit multiplier comparisons to Xenos (the XBox 360's GPU), most indications are that it's closer to 2 or 3 times the raw power of Xenos, as opposed to the 1.5 times quoted in the OP. There are a few things we do know about the GPU, though:

- The 32MB of eDRAM is the only hard number we have about the GPU. This is more than three times the size of the eDRAM framebuffer on Xenos, and should allow games to achieve either 720p with 4x AA or 1080p with no AA, without having to do tiling (the need to tile AA'd HD images on the Xenos framebuffer made its "free" AA a lot less free); the arithmetic is sketched after this list. It's also possible (although unconfirmed) that the eDRAM is on-die with the GPU, as opposed to merely on the same package (and hence on another die). If true, this means that the eDRAM will have much lower latency and possibly much higher bandwidth than the XBox 360's set-up. Developers will have to actively make use of the eDRAM to get the most out of it, though.

- The GPU features a tessellator. However, we have no idea whether it's a 4000-series tessellator (i.e. not very good) or perhaps a more modern 6000-series tessellator (a lot better). Again, developers would have to actively make use of this in their game engines.

- The GPU is heavily customized and features some unique functionality. Although we don't have any reliable indications of what sort of functionality Nintendo has focused on, it's been speculated that it's related to lighting. Apparently games which make good use of this functionality should see substantial improvements in performance. More than any other feature of the console, though, developers really need to put in the effort to optimize their engines for the GPU's customizations to get the most out of them.

- The GPU has a customized API, based on OpenGL. Regular OpenGL code should run, but won't run very well and won't make any use of the GPU's custom features. Developers will need a good understanding of the GPU's API to get the most out of it.
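
For the curious, the framebuffer arithmetic behind the eDRAM bullet works out as below. This assumes a standard RGBA8 colour buffer plus 24/8 depth-stencil (8 bytes per sample in total); actual render-target formats may differ.

Code:
#include <stdio.h>

/* Framebuffer size = width x height x bytes-per-sample x MSAA factor. */
static double framebuffer_mib(int w, int h, int msaa)
{
    const double bytes_per_sample = 4.0 + 4.0; /* RGBA8 + depth/stencil */
    return (double)w * h * bytes_per_sample * msaa / (1024.0 * 1024.0);
}

int main(void)
{
    printf("720p,  4xAA:  %.1f MiB\n", framebuffer_mib(1280, 720, 4));  /* ~28.1: fits in 32MB, not in Xenos's 10MB */
    printf("1080p, no AA: %.1f MiB\n", framebuffer_mib(1920, 1080, 1)); /* ~15.8: fits */
    printf("1080p, 4xAA:  %.1f MiB\n", framebuffer_mib(1920, 1080, 4)); /* ~63.3: doesn't fit */
    return 0;
}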

RAM

It seems the console will have either 1.5GB or 2GB of unified RAM, with indications that Nintendo were targeting 1.5GB with earlier dev-kits and later increased that to 2GB. We don't know the kind of RAM being used, but most expect DDR3, probably with a 128-bit interface and a clock speed somewhere in the 750MHz to 1GHz range, resulting in a bandwidth somewhat, but not significantly, higher than the XBox 360 and PS3 (a rough calculation follows below). It's worth noting that the large CPU cache and GPU eDRAM somewhat mitigate the need for very high bandwidth. It's possible, but quite unlikely, that they're using GDDR5, which would mean much higher bandwidth.
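
As a rough sanity check on those numbers (the 128-bit interface and the clocks are the rumoured figures above, not confirmed specs): peak DDR bandwidth is the bus width in bytes times the effective transfer rate, and DDR3 transfers twice per I/O clock.

Code:
#include <stdio.h>

int main(void)
{
    const double bus_bytes = 128.0 / 8.0; /* rumoured 128-bit interface */
    const double clocks_mhz[] = { 750.0, 800.0, 1000.0 };

    for (int i = 0; i < 3; i++) {
        /* DDR: two transfers per clock, so 800 MHz -> 1600 MT/s. */
        double gb_per_s = bus_bytes * 2.0 * clocks_mhz[i] * 1e6 / 1e9;
        printf("%4.0f MHz DDR3 -> %.1f GB/s\n", clocks_mhz[i], gb_per_s);
    }
    /* Prints 24.0, 25.6 and 32.0 GB/s, versus roughly 22.4 GB/s for the
     * XBox 360's main RAM and 25.6 GB/s for the PS3's XDR. */
    return 0;
}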


Going by what we know about the console's hardware, it should be able to produce games which noticeably out-perform what's available on XBox 360 and PS3, so long as everything's properly optimized. Of course, performance will still be far behind the PS4 and next XBox. What we're seeing at E3 is unlikely to be well optimized for a number of reasons:

- "Final" dev-kits, with actual production hardware, only started to arrive to developers a few weeks ago. This would be too late for the E3 demos to make any real use of any improvements this final hardware may have brought. We know that these dev-kits brought a slight improvement in performance, but we don't know if there were any changes in functionality (eg to the eDRAM, which could indicate why we're seeing so little AA).

- Nintendo don't seem to have locked down the clock speeds yet, which makes it difficult for developers to properly optimize games for the hardware. As Nintendo now has final production hardware to do thermal testing on, final clock speeds should come pretty soon.

- For third party multi-plats, the XBox360 and PS3 versions are going to sell the most (due to higher install-bases), so developers are going to put more resources towards those versions, and are likely to put the more talented team-members on XBox360 and PS3 development as well. Because they can get PS360-grade performance out of the Wii U with a quick, poorly optimized port, most aren't going to bother putting the time and money into substantially improving the Wii U version.

- We've only seen launch-window titles, and launch-window titles that are about five months from completion, at that. I can only think of a single case where a game for new hardware was actually well optimized at this point before the launch of the console (Rogue Leader for Gamecube).

- While third parties are unlikely to make good use of the hardware, Nintendo haven't shown any games from the first party studios most likely to really push the hardware (eg Retro, Monolith, EAD Tokyo, EAD Kyoto Group 3). These studios are the ones to watch for technically impressive games in the first couple of years of the Wii U's life.


Interestingly, the best-looking game that's been shown off thus far is probably Platinum's Project P-100. While people haven't been focusing on it from a technical perspective that much because of the art style, it's got great textures, good polygon detail, very nice lighting, good effects, a nice DoF effect, the IQ seems good and the framerate seems smooth. In some parts it also does nice 3D visuals on both the TV and controller screen. I wouldn't go so far as saying it looks substantially better than anything we've seen on PS360 (certainly not without seeing it in person), but it's definitely a nice looking game.
Yes but WILL IT HAS GOOD GRAPHIC?
 

MDX

Member
Here is a question:

Current Wii U (launch) games, including Nintendo's, all seem to be running at 720p. Reggie said this console could handle 1080p, but he tends to embellish a lot. He might have meant streaming movies at 1080p, but hoped the audience assumed games.

So what I want to know is: if a developer is pretty far into development of a game targeting 720p, how difficult would it be for them to switch to 1080p?

I'm just wondering if developers targeted 720p based on early dev kits, but newer kits might indicate future games in 1080p.
 

Raide

Member
Thanks for the post Thraktor. One of my main concerns is the number of things in the Wii U that developers really have to learn to get the best out of them. How many third-party developers are going to go through that effort to get stuff running on the console? We saw what happened with the PS3, where the gulf between the developers who sat down and learned the hardware (Naughty Dog etc.) and everyone else was pretty massive.
 

v1oz

Member
Here is a question: Current Wii U (launch) games, including Nintendo's, all seem to be running at 720p. [...] So what I want to know is: if a developer is pretty far into development of a game targeting 720p, how difficult would it be for them to switch to 1080p?
Reggie probably meant the console can scale to 1080p.
 

Christine

Member
The system is unveiled, so there's not much to speculate about. Feel free to make a post-reveal general thread though.

My intention wasn't to bar people from having a thread about the Wii U, but rather I assumed people would make a Wii U Information Thread or something after the conference that actually included all the new information.

Anybody mind if I do this?
 

Cygnus X-1

Member
I thought I'd chime in with a few of the things we know about the Wii U's hardware from the speculation threads (and by "know" I mean info which has been confirmed by multiple reliable sources). [...]

Very insightful. Thanks for sharing.
 
Thanks for the post Thraktor. One of my main concerns is the number of things in the Wii U that developers really have to learn to get the best out of them. [...]

Well, paying Epic to optimise UE3 for WiiU specific builds would pretty much solve that problem for 90% of third party titles in a stroke.
 

marc^o^

Nintendo's Pro Bono PR Firm
Thanks for the post Thraktor. One of my main concerns is the number of things in the Wii U that developers really have to learn to get the best out of them. [...]
A second element of the answer should come from EA Sports games, soon enough.
 

Thraktor

Member
Great post thraktor. You think it will be able to hit that UE4 minimum?

In theory it should have the functionality for UE4. The question is whether Epic will go to the trouble of porting it over when, even heavily optimized, it'll look a lot worse than it does on PS4 and the next XBox. I think it also depends a bit on how the console does in its early life in terms of demographics. For instance, if games like Blops 2 and Colonial Marines sell really well on Wii U, publishers will start to think of bringing their future "core games" to the console, and hence will start requesting a Wii U port of UE4. Basically it'll come down to a business decision for Epic; if they think a Wii U version is a selling point, they'll port it over, if they don't, they won't.
 

Raide

Member
Well, paying Epic to optimise UE3 for WiiU specific builds would pretty much solve that problem for 90% of third party titles in a stroke.

That is true. Now it depends on whether Epic are willing to rework UE3 for that purpose, or whether they're already trying to shift people towards the future UE4.
 

Van Owen

Banned
Here is a question: Current Wii U (launch) games, including Nintendo's, all seem to be running at 720p. [...] So what I want to know is: if a developer is pretty far into development of a game targeting 720p, how difficult would it be for them to switch to 1080p?

Unless it's a simple game like Rayman, expect 720p for 99% of Wii U games.
 

Thraktor

Member
Here is a question: Current Wii U (launch) games, including Nintendo's, all seem to be running at 720p. [...] So what I want to know is: if a developer is pretty far into development of a game targeting 720p, how difficult would it be for them to switch to 1080p?

The console is definitely capable of producing graphics at 1080p (so were the PS3 and XBox 360, though). The thing is that most developers will target 720p instead because it frees up the GPU to do a lot more on screen, and most people don't have 1080p TVs anyway (and many that do wouldn't be able to tell the difference). We'll see a few 1080p games in the console's lifetime, but expect 720p for the most part.
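
To put a number on that trade-off (pixel counts only, ignoring memory and bandwidth effects):

Code:
#include <stdio.h>

int main(void)
{
    const double p720  = 1280.0 * 720.0;  /*   921,600 pixels */
    const double p1080 = 1920.0 * 1080.0; /* 2,073,600 pixels */

    /* Every extra pixel shaded at 1080p is GPU time not spent on effects. */
    printf("1080p shades %.2fx the pixels of 720p\n", p1080 / p720); /* 2.25x */
    return 0;
}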

Thanks for the post Thraktor. One of my main concerns is the number of things in the Wii U that developers really have to learn to get the best out of them. [...]

Well, Naughty Dog didn't just sit down and learn the hardware, they're an incredibly technically talented team. I wish Factor 5 was still around at the moment, to be honest, as they were another team that could do amazing things with hardware, and a HD Rogue Squadron game by them for the Wii U could look amazing.

Anyway, I do think that third parties will get better use out of the hardware over time, but like the XBox 360 and PS3, the best looking games will be the exclusives.
 
Wait wait, the GC was more powerful than the PS2, nearly on the same level as the Xbox, if not at that level.

I know, but it ended up being irrelevant due to all the other crap Nintendo did at the same time, from stupid discs to horrible third party support. They were never able to use this power advantage (unlike on the N64 against the Saturn and PSX, and with the SNES vs Mega Drive), so it's pointless.
 

Zero148

Member
So, if we can take the information in Thraktor's post as confirmed, wouldn't that explain why the multiplats don't look better?

And furthermore, if a tech-wise talented team (Retro, for example) puts some serious effort into a Wii U game, could it end up being really impressive visually, and not doable on PS360?
 

MDX

Member
The console is definitely capable of producing graphics at 1080p (so were the PS3 and XBox 360, though). [...] We'll see a few 1080p games in the console's lifetime, but expect 720p for the most part.

That's quite clear, but I'm just wondering technically: can developers switch midstream from 720p to 1080p without much issue, or do they get locked into a particular resolution once they start development?
 

KageMaru

Member
Wait wait, the GC was more powerful than the PS2, nearly on the same level as the Xbox, if not at that level.

This is not true.

Here is a question: Current Wii U (launch) games, including Nintendo's, all seem to be running at 720p. [...] So what I want to know is: if a developer is pretty far into development of a game targeting 720p, how difficult would it be for them to switch to 1080p?

It's unlikely we'll see many 1080p games. The more the system is pushed, the less likely they are to raise the resolution.
 
Think about how powerful the Wii was compared to the Xbox, PS2, Gamecube etc.

The Wii U will be of a similar magnitude more powerful when compared to the 360 and PS3.
 

Raide

Member
Well, Naughty Dog didn't just sit down and learn the hardware, they're an incredibly technically talented team. [...]

Naughty Dog are utterly crazy and I cannot believe some of the stuff they pump out of a PS3. I would kill to see them go multiplatform, just so other people see what they can do. Hell, it would be interesting to see what they could draw out of the Wii U.

I guess the question is, who is brave enough to make those exclusives? I don't see Nintendo trying to buy GTAV. Rockstar just won't want to lose the cash. Epic just love the 360. So that might leave Nintendo as the ones forging ahead, and you end up in the same situation as the Wii.
 

Thraktor

Member
That's quite clear, but I'm just wondering technically: can developers switch midstream from 720p to 1080p without much issue, or do they get locked into a particular resolution once they start development?

They can switch mid-development, but the problem is that if they were originally targeting 720p it'll require a hell of a lot of work to get the game running at a decent framerate at 1080p, so it's very unlikely that any games we've seen at E3 will switch, especially given the time constraints.
 
Blimey, can't believe we're discussing this. Forgive me if someone else apart from my good self has already pointed this out, but this article is nonsensical for the following reasons:

1) The U has a DSP; the 360 has 1 out of its 6 total threads exclusively reserved for dealing with sound. That's nearly 17 percent more CPU power for a start, assuming that developers aren't going to set aside more threads to deal with sound (and most of them have double that, 2 out of 6 threads, a third of the 360's power faffing about with sound).

2) The U has a CPU that uses OoOE; the Xenon uses in-order execution, meaning the former will be more efficient than the latter.

3) The dev kits have 3GB of RAM and no OS; the OS uses a large amount of RAM (512MB reserved, according to IdeaMan's sources), so we're looking at 2GB in total, with 3 times more RAM available to developers than on the 360.

4) Sources here have told us that the U's CPU has 3MB of eDRAM on-die compared to the 1MB that the Xenon has. The same sources have told us that the U's GPU has 32MB of eDRAM on-die compared to the 10MB that the Xenos has.

5) The U's GPU is at least 2 generations ahead of the RSX and Xenos and will have a fuller feature-set as a result.



I'm filing this under B for bollocks myself. On paper we're looking at a system around 3 times more powerful than the 360 but imo, in real-world performance terms, the OoOE CPU, GPU feature-set and eDRAM should push that to around 4 times more powerful.

Edit: Looks like I was beaten by Thraktor!
 
The system is unveiled, so there's not much to speculate about. [...] I assumed people would make a Wii U Information Thread or something after the conference that actually included all the new information.

Do you have any objection to it being called a speculation thread?
 
Think about how powerful the Wii was compared to the Xbox, PS2, Gamecube etc.

The Wii U will be of a similar magnitude more powerful when compared to the 360 and PS3.

That's not an accurate comparison; the Wii U's technical specifications are more impressive for their time than the Wii's were.
 

RedSwirl

Junior Member
- For third party multi-plats, the XBox 360 and PS3 versions are going to sell the most (due to higher install-bases), so developers are going to put more resources towards those versions [...]

This was a big problem with the Gamecube, and it became part of the reason why people stopped buying Gamecube versions of games. It turned into a vicious cycle.
 
I've been down on Nintendo like a lot of people here last week, but I seriously cannot wait to see what their best teams are able to accomplish on the WiiU. It won't stack up to the 720 and PS4, but it obliterates the Wii like Thor's hammer would a soda can. If you enjoy the games Nintendo makes and you're not excited about that, your brain isn't working properly.
 

Zee-Row

Banned
So if this technical mumbo jumbo is correct, the Wii U can render the same amount of things on screen as the PS3 and 360, but has much more RAM for much higher quality textures that don't look as muddy?
 

nordique

Member
Think about how powerful the Wii was compared to the Xbox, PS2, Gamecube etc.

The Wii U will be of a similar magnitude more powerful when compared to the 360 and PS3.

No, not quite... see this post below

Blimey, can't believe we're discussing this. [...] I'm filing this under B for bollocks myself. On paper we're looking at a system around 3 times more powerful than the 360 but imo, in real-world performance terms, the OoOE CPU, GPU feature-set and eDRAM should push that to around 4 times more powerful.


There are differences in the hardware, and we won't see them fully fleshed out for some time. I think Thraktor's post was also excellent, as was yours, and both should be in the OP of every Wii U GFX thread lol :p


Don't forget, however, that the GPU also includes additional customized functionality.
 

M3d10n

Member
The WiiU version of Trine 2 is going to have better graphics than the 360/PS3 versions, according to the developers. So there's that.

But it's going to take a while for developers to build up their tools and get familiar with the system, whereas they've had a whole generation to do that with the 360/PS3. It's why I wouldn't be expecting most PS4/720 games to look that much better from the start. People easily forget what launch 360 games looked like.

Trine 2 is mostly GPU grunt work, so it's no wonder it looks great. It also uses a deferred renderer, so the (rumored) 32MB eDRAM would help a lot.
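
To put rough numbers on that: the layout below (four RGBA8 targets plus 24/8 depth-stencil) is a common deferred-rendering setup for this era, not Trine 2's actual configuration, which hasn't been disclosed.

Code:
#include <stdio.h>

int main(void)
{
    const double pixels = 1280.0 * 720.0;      /* 720p */
    const double bytes_per_px = 4 * 4.0 + 4.0; /* 4 RGBA8 targets + depth-stencil */

    /* ~17.6 MiB: comfortably inside a 32MB eDRAM pool, far beyond
     * the 10MB on Xenos. */
    printf("720p G-buffer: %.1f MiB\n", pixels * bytes_per_px / (1024.0 * 1024.0));
    return 0;
}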

Games like AC3 have large CPU workloads, which are more complex to optimize. There's also various middleware which needs to be re-optimized for the new hardware. We don't know if the game is using the dedicated audio DSP, which could free up more than 10% of CPU resources, depending on how much audio the game is doing, for example.

I also suspect the Wii U CPU might have traded in-order performance for out-of-order performance. This would require a great deal of adjustment for PS360 engines to get the most out of it.

For an example of how important a CPU's architectural differences are, just look at the CPU requirements for most console-to-PC ports. Some games require CPUs over 5 times more powerful than the 360's to run as well.

I'd really like to see a port of Rage. The extra RAM would do wonders for the pop-in.
 

nordique

Member
So if this technical mumbo jumbo is correct, the Wii U can render the same amount of things on screen as the PS3 and 360, but has much more RAM for much higher quality textures that don't look as muddy?

In the Wii U's case, yes, much richer textures, and better lighting/shadow effects.

The lighting and shadow effects are where the Wii U (and all other next gen consoles) will outshine the current HD systems.

Nintendo Land, for example, has a really interesting lighting system. It looks fantastic. It's something that was immediately apparent to me watching the trailer.
 

v1oz

Member
I'm filing this under B for bollocks myself. On paper we're looking at a system around 3 times more powerful than the 360 [...]
That's very optimistic, from both you and Thraktor. But we won't know for certain until we see the CPU and GPU clock rates, and how many FLOPS the GPU can push. What we have seen so far is not looking good.
 

Kilrogg

paid requisite penance
I thought I'd chime in with a few of the things we know about the Wii U's hardware from the speculation threads (and by "know" I mean info which has been confirmed by multiple reliable sources). [...]

Looks like brain_stew brought a friend. I'm good with that.
 
- Firstly, it supports out-of-order execution. [...]

Even if true, the actual gain will depend on the prefetching logic of the new CPU, and on how many transistors IBM spends there. I always thought Xenon would have been a better processor as a dual-core with OoO.

- Secondly, we have the larger cache (3MB vs 1MB). [...]

This is a huge difference. Xenon has so little cache (1MB), whereas Cell has a laughable amount (512KB).

- The Wii U has a dedicated DSP for audio and a dedicated I/O processor. [...]

I have a hard time believing sound processing would take an entire thread of Xenon. That's true in the case of Cell because of the SPEs' private caches, but not in a modern CPU like Xenon.

The CPU should have quite a bit less raw power than the PS3's Cell [...]

"bit less raw power" in what? Both Xenos and this new CPU outperform hard Cell in general purpose computing. Cell is very strong in floating point, but a weak CPU overall.


The GPU is likely to be VLIW-based, with a pretty modern feature-set and 32MB of eDRAM. [...]

The 1.5x numbers could be true if this GPU is like those embedded CPU+GPU parts (marketed by AMD as APUs), with lackluster bandwidth and shared resources. You can throw in as many shaders and MHz as you want; without proper bus width, ROPs, etc., performance will suffer a lot.

- The GPU features a tessellator. [...]

Even AMD's DX11-compliant tessellator is very weak compared to Nvidia's. Better than nothing, though.

It seems the console will have either 1.5GB or 2GB of unified RAM [...]

CPUs don't benefit at all from higher bandwidth. The 3x RAM size should be a huge leap and make the difference here.


This machine should be able to mop the floor with both the 360 and PS3, but it's still in the same generational league. This will be very much like the Genesis/Mega Drive to Neo Geo leap: tuned-up CPU, better GPU (VDP), more RAM.
 

Lord Error

Insane For Sony
- The 32MB of eDRAM is the only hard number we have about the GPU. This is more than three times the size of the eDRAM framebuffer on Xenos, and should allow games to achieve either 720p with 4x AA or 1080p with no AA
Yeah, that's the 'hard' number coming from some dev sources, but again not official, and I'm finding it more and more difficult to believe, seeing how practically all games shown so far neither run at 1080p/no AA nor at 720p with 4xAA (they mostly have no AA, which is baffling if the eDRAM value is true)
 

wsippel

Banned
Yeah, that's the 'hard' number coming from some dev sources, but again not official, and I'm finding it more and more difficult to believe, seeing how practically all games shown so far neither run at 1080p/no AA nor at 720p with 4xAA (they mostly have no AA, which is baffling if the eDRAM value is true)
All the eDRAM in the world doesn't make AA or 1080p free. It's also been speculated that the eDRAM isn't just a dedicated framebuffer like the 360 eDRAM.
 
That's very optimistic, from both you and Thraktor. But we won't know for certain until we see the CPU and GPU clock rates, and how many FLOPS the GPU can push. What we have seen so far is not looking good.

You have to remember that Nintendo themselves don't know how many FLOPS the GPU will push, or the clock speeds of any of the other components. The individual components were probably only finalised last month, and Nintendo will be tweaking clocks, power and cooling right up until a month or two before launch.

Developing for launch or the launch window is a bloomin nightmare, because developers and publishers end up with a constant stream of dev kit and SDK revisions going back and forth, with said revisions at this stage creating artifacts, resetting or even burning out dev kits if someone at Nintendo royally cocks up lol.

People are generally under the misapprehension that launch and launch-window titles aren't up to snuff as far as 'looking next gen' goes because developers need time to get used to working with the hardware, but the main cause is the fact that 90% of the development is done on underpowered dev kits.
 

Raide

Member
You have to remember that Nintendo themselves don't know how many FLOPS the GPU will push, or the clock speeds of any of the other components. [...] 90% of the development is done on underpowered dev kits.

If they are intending to launch this winter, that kind of stuff has to be nailed down. They cannot leave it until 1 month before launch, or else they'll run into massive supply issues. They have to get it done way before that so they can actually produce retail units.
 
If they are intending to launch this winter, that kind of stuff has to be nailed down. They cannot leave it until 1 month before launch, or else they'll run into massive supply issues. They have to get it done way before that so they can actually produce retail units.

Pretty sure MS was in a similar situation in regards to RAM on the 360. I don't think it was until the final year of production on the system that they finally decided to heed Epic and make it 512MB.

That being said... I do believe at this point, the devkits have been finalized, but several studios are still missing out on it.
 
If they are intending to launch this winter, that kind of stuff has to be nailed down. They cannot leave it until 1 month before launch, or else they'll run into massive supply issues. They have to get it done way before that so they can actually produce retail units.

It's common practice when launching a new console, mate. Everything is done at the last minute.
 
Ahahahaaha!!! MY SIDE!!! MY SIDE!!!

Holy shit Nintendo! As powerful as the PS3/360 almost a decade later? And the GPU is only slightly more powerful as well?

There are cell phones and tablets more powerful than this. This is hilarious.

Even if it's because of the Wii U's architecture, unless the thing is a Sega Saturn this is still a massive disappointment. Even if Nintendo was going to use bottom-of-the-barrel components, the system should still blow away the PS3/360.
 

Vic

Please help me with my bad english
Ahahahaaha MY SIDE! MY SIDE!!! [...] There are cell phones and tablets more powerful than this. This is hilarious.
No, there aren't.
 

Diablos54

Member
Ahahahaaha MY SIDE! MY SIDE!!! [...] There are cell phones and tablets more powerful than this. This is hilarious.
Did you read those 2 big ass posts on this page? Cause it looks like you didn't...
 

Raide

Member
Pretty sure MS was in a similar situation in regards to RAM on the 360. [...]

This I can understand, especially with such a push from Epic.


It's common practice when launching a new console, mate. Everything is done at the last minute.

This sounds like a crazy train bound for RRoD. Like games, at some point it has to go gold and go into retail production. Do hardware devs really leave it till the last minute? They have 6 months left of this year.
 

Mlatador

Banned
I thought I'd chime in with a few of the things we know about the Wii U's hardware from the speculation threads (and by "know" I mean info which has been confirmed by multiple reliable sources). [...]

Thanks a lot! Soo informative, I couldn't stop reading!
 

snesfreak

Banned
Ahahahaaha!!! MY SIDE!!! MY SIDE!!! [...] There are cell phones and tablets more powerful than this. This is hilarious.
Wow.
Not only are you wrong about cell phones but you clearly haven't read through this thread.
 