
Rumor: Wii U final specs

This argument is getting tiring. It's already been confirmed that MS and Sony's consoles will be x86 based, while the WiiU is PowerPC based. These are vastly different architectures that will make porting very time consuming between the two.

The "core game" will not be close at all. In fact, there probably won't even be a core game since third parties will have given it the same status as the Wii at that point.

The armchair programmers in this thread are obvious, claiming to know better than the developers actually working on the system.

Are you counting yourself as an armchair programmer? Because as a programmer, I haven't had to worry at all about which CPU a system was using in years (well, except *once* when I ported a library written in Z80 assembly to ARM so we could use a proprietary compression library written for Gameboy Color on the iPhone). You know what I do when I need to switch to developing for a different CPU? I switch compilers, and sometimes some build settings. That's it. It doesn't require creating a new engine or anything, it doesn't make the porting take longer at all. Admittedly, I've never developed for the Wii U.

Unless you are programming in pure assembly (something that is rarely needed these days), it's up to the compiler you use to optimize your code for the CPU. On top of that, there's more and more middleware these days where you just switch the platform to Wii U or Xbox 360 and rebuild, and boom, most of your game is functional on the new platform.
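To make that concrete, here's a contrived sketch (not from any real engine or console SDK, just an illustration): the bulk of game code looks like this, and the CPU only enters the picture when you pick a toolchain.

    // Contrived example - nothing here is architecture-specific, so moving it between
    // a PowerPC target and an x86 target is mostly a matter of pointing a different
    // compiler at it and rebuilding.
    #include <numeric>
    #include <vector>

    float AverageFrameTime(const std::vector<float>& samples) {
        if (samples.empty()) return 0.0f;
        // The compiler emits whatever instructions suit the target CPU;
        // "porting" this file is just recompiling it.
        const float total = std::accumulate(samples.begin(), samples.end(), 0.0f);
        return total / static_cast<float>(samples.size());
    }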
 

AzaK

Member
Are you counting yourself as an armchair programmer? Because as a programmer, I haven't had to worry at all about which CPU a system was using in years (well, except *once* when I ported a library written in Z80 assembly to ARM so we could use a proprietary compression library written for Gameboy Color on the iPhone). You know what I do when I need to switch to developing for a different CPU? I switch compilers, and sometimes some build settings. That's it. It doesn't require creating a new engine or anything, it doesn't make the porting take longer at all. Admittedly, I've never developed for the Wii U.

Unless you are programming in pure assembly (something that is rarely needed these days), it's up to the compiler you use to optimize your code for the CPU. On top of that, there's more and more middleware these days where you just switch the platform to Wii U or Xbox 360 and rebuild, and boom, most of your game is functional on the new platform.

Thanks, you saved me a lot of typing.
 
Question: how would the eDRAM improve the overall bandwidth of the main RAM?

If you mean the overall bandwidth of the system memory, then the effect should be huge. Depending on the various bottlenecks and limitations of course... but in practice the most used assets should be stored in eDRAM, so the relatively slow main RAM bus will not be used as often. Looks like most of the games will be running in 720p, so there should be plenty of room left after the frame buffer.

edit: We don't know that much about the memory architecture, for example how the eDRAM is shared between the CPU and GPU.
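To put rough numbers on the 720p point above (purely a back-of-the-envelope, assuming the rumored 32MB of eDRAM and plain 32-bit color/depth buffers with no MSAA):

    #include <cstdio>

    int main() {
        const double kMiB         = 1024.0 * 1024.0;
        const double bufferMiB    = 1280.0 * 720.0 * 4.0 / kMiB; // ~3.5 MiB per 720p buffer
        const double framebuffers = bufferMiB * 3.0;             // front + back + depth, ~10.5 MiB
        const double leftOver     = 32.0 - framebuffers;         // ~21.5 MiB left for hot data
        std::printf("%.1f MiB per buffer, %.1f MiB for framebuffers, %.1f MiB left\n",
                    bufferMiB, framebuffers, leftOver);
        return 0;
    }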
 

deviljho

Member
You know what I do when I need to switch to developing for a different CPU? I switch compilers, and sometimes some build settings. That's it. It doesn't require creating a new engine or anything, it doesn't make the porting take longer at all.

Wouldn't it be reasonable to say that the developers of launch day Wii U ports likely did exactly what you are saying? Or are the available tools so shitty that these ports' performance was still hampered?

I'm pretty sure that the "2nd wave of launch titles" like Pikmin, Wonderful 101 and Aliens: CM will better serve as the basis for comparison.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Are you counting yourself as an armchair programmer? Because as a programmer, I haven't had to worry at all about which CPU a system was using in years (well, except *once* when I ported a library written in Z80 assembly to ARM so we could use a proprietary compression library written for Gameboy Color on the iPhone). You know what I do when I need to switch to developing for a different CPU? I switch compilers, and sometimes some build settings. That's it. It doesn't require creating a new engine or anything, it doesn't make the porting take longer at all. Admittedly, I've never developed for the Wii U.

Unless you are programming in pure assembly (something that is rarely needed these days), it's up to the compiler you use to optimize your code for the CPU. On top of that, there's more and more middleware these days where you just switch the platform to Wii U or Xbox 360 and rebuild, and boom, most of your game is functional on the new platform.
That said, switching compilers can be a 'fun' experience. Heck, switching compiler versions can be a fun experience.
 

Septimius

Junior Member
Wouldn't it be reasonable to say that the developers of launch day Wii U ports likely did exactly what you are saying? Or are the available tools so shitty that these ports' performance was still hampered?

I'm pretty sure that the "2nd wave of launch titles" like Pikmin, Wonderful 101 and Aliens: CM will better serve as the basis for comparison.

Aside from CPU-architecture-specific operations - like how I'm guessing the Vita CPU might have hardware tuned for encryption, and how there are CPUs made to decompress on the fly - changing CPU architecture shouldn't have much of an impact on performance. And given that many game programmers don't know assembly well enough to exploit architecture-specific operations, it's not an unfair assessment to say that compiling for a new CPU architecture is most likely the least of the bottlenecks on the Wii U. After all, it's more efficient to have someone who knows how to write a game write the game, and someone who knows how the CPU works write the compiler for it.

Of course, that means if you're developing for Wii U specifically, there might be certain things you can use to improve speed on some operations, but I doubt it's on anyone's mind.
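That division of labor is easy to see in practice. A hedged illustration (the compiler invocations in the comments are placeholders, not actual console toolchains): the game programmer writes a plain loop, and picking PowerPC-friendly vs x86-friendly instructions is entirely the compiler's job.

    // Illustration only - hypothetical retargeting of the same source file:
    //   powerpc-eabi-g++ -O3 -mcpu=750 integrate.cpp   (a PowerPC-style target)
    //   g++ -O3 -march=x86-64 integrate.cpp            (an x86 target)
    #include <cstddef>

    void Integrate(float* pos, const float* vel, std::size_t n, float dt) {
        for (std::size_t i = 0; i < n; ++i) {
            // Instruction selection, scheduling and (where the target supports it)
            // vectorization of this loop are the compiler's problem, not the game's.
            pos[i] += vel[i] * dt;
        }
    }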
 

TheD

The Detective
True, but the Wii U GPU is capable of doing all the "real" effects without faking them in these ports from the current-gen HD systems. Third parties, however, are not going to put extra resources into getting those effects to work on the Wii U - a platform that just launched, with no guarantee that the extra effort and money spent on ports will earn anything back.

The Wii U GPU is not an e6760, granted, but it will perform very similarly to that model with some "custom Nintendo tweaks" added in. 570-600 GFLOPS and all the modern effects like compute shaders, but in an OpenGL format of course......the GPU is not the cause for concern though, right?

The Wii U CPU is a new and extremely efficient processor that does Out-of-Order Execution, which has not been shown at all in these ports. The Wii U CPU is most likely running these ports "in order", relying only on raw parallel data transfer like an older-style CPU similar to what is in the Xbox 360. Since it's clocked lower than those older in-order CPUs, developers are having performance issues with the CPU. Understandable. All modern CPUs are capable of Out-of-Order Execution, including Sandy Bridge and Bulldozer: http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=98&Itemid=1&limit=1&limitstart=9


Take a look at the Benefits of OoOE vs IoE: http://www.cs.washington.edu/education/courses/csep548/06au/lectures/introOOO.pdf

The Wii U CPU, when used the way it's intended to operate (OoOE), would run circles around the last-gen CPU in the Xbox 360. A game like Black Ops 2, which has been in development for 2+ years and was built from the ground up to take maximum advantage of the 360 architecture first, was never going to run just as smoothly on Wii U without a lot of optimizing and re-coding. The tiny Wii U Treyarch team (6-7 guys) should be congratulated for getting the game running "good enough" for the launch of the console with probably less than a year of development.

Basically we are seeing Xbox 360 games cut and pasted onto the Wii U with very little optimizing, not taking advantage of the very important OoOE CPU that would deliver much better results if developers wanted to take the time to use it. Which they don't, and I totally understand why, since it's just not cost-effective for them to spend a lot of money on a new console with a small user base.

Points to take from this:

  • The Wii U CPU is modern, not 7-year-old tech
  • The Wii U CPU is not being used in these ports the way it was designed to be used
  • Third parties are happy porting over Xbox 360 games with minimal optimization and no improvements to graphics, since doing more requires money and work they can't afford.

You really have no clue.

The Wiiu is not even close to powerful enough not to need to fake things!
THAT IS THE FACT OF ALL REAL TIME RENDERING!

OoOE is automatic! If the WiiU has it, all games use it!

Are you counting yourself as an armchair programmer? Because as a programmer, I haven't had to worry at all about which CPU a system was using in years (well, except *once* when I ported a library written in Z80 assembly to ARM so we could use a proprietary compression library written for Gameboy Color on the iPhone). You know what I do when I need to switch to developing for a different CPU? I switch compilers, and sometimes some build settings. That's it. It doesn't require creating a new engine or anything, it doesn't make the porting take longer at all. Admittedly, I've never developed for the Wii U.

Unless you are programming in pure assembly (something that is rarely needed these days), it's up to the compiler you use to optimize your code for the CPU. On top of that, there's more and more middleware these days where you just switch the platform to Wii U or Xbox 360 and rebuild, and boom, most of your game is functional on the new platform.

You forget the fact that the next Sony and MS consoles will have much more powerful CPUs than the WiiU, which will make porting very hard.
 

fritolay

Member
Yes it's getting very tiring. I don't claim to know more than a developer, just using common sense, something that has escaped many when it comes to the Wii U.

Even if the next-gen Xbox and PlayStation use x86, they will still be using a modern CPU capable of OoOE, just like the Wii U and all modern CPUs. Wii U ports that are treated with care by a third party would be able to take advantage of what the system has to offer in its own custom box, and developers will use these features once Wii U development starts taking off. Just like developers have "faked" effects that look like something from DX11 on Xbox 360 and PS3, the same can be done on the Wii U, except the Wii U is already capable of those effects without faking them. How many of those effects it can do at the same time might be questionable, though.

Yet again, people somehow think these games are going to look ugly or something, when all it's going to be is a different kind of pretty. Diminishing returns will keep diminishing, and just how much money are developers willing to spend on Xbox 3 and PS4 to get their games closer to CGI quality? Is it worth it to them? You have to consider that as well.

Wii U seems like it's going to be a developer-friendly system for devs looking to make some money back by releasing some nice quality efforts on the eShop. Not to mention that if they up their efforts on the ports, they will earn consumer trust. Nintendo, of course, will need to show what the system can really do in the end.

No doubt Zelda, Metroid, Smash Bros. and a new 3D Mario game will be so stunning graphically that all those higher specs of other consoles won't mean a damn thing when you see them running on Wii U.

Take a look at the first COD games on the Xbox 360 compared to today's games. Look at the outstanding graphics in Halo 4. That doesn't come from just porting stuff over; it comes from developing for the specific hardware, it seems to me. Otherwise they would have had great graphics in year 1, right?

Therefore to really get great graphics out of a system, you need to invest time and money. You also have to consider what the time and money will get you on a return and what the ceiling of the hardware is.
 

DieH@rd

Banned
True, but the Wii U GPU is capable of doing all the "real" effects without faking them in these ports from the current-gen HD systems. Third parties, however, are not going to put extra resources into getting those effects to work on the Wii U - a platform that just launched, with no guarantee that the extra effort and money spent on ports will earn anything back.

The Wii U GPU is not an e6760, granted, but it will perform very similarly to that model with some "custom Nintendo tweaks" added in. 570-600 GFLOPS and all the modern effects like compute shaders, but in an OpenGL format of course......the GPU is not the cause for concern though, right?

The Wii U CPU is a new and extremely efficient processor that does Out-of-Order Execution, which has not been shown at all in these ports. The Wii U CPU is most likely running these ports "in order", relying only on raw parallel data transfer like an older-style CPU similar to what is in the Xbox 360. Since it's clocked lower than those older in-order CPUs, developers are having performance issues with the CPU. Understandable. All modern CPUs are capable of Out-of-Order Execution, including Sandy Bridge and Bulldozer: http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=98&Itemid=1&limit=1&limitstart=9


Take a look at the Benefits of OoOE vs IoE: http://www.cs.washington.edu/education/courses/csep548/06au/lectures/introOOO.pdf

The Wii U CPU, when used the way it's intended to operate (OoOE), would run circles around the last-gen CPU in the Xbox 360. A game like Black Ops 2, which has been in development for 2+ years and was built from the ground up to take maximum advantage of the 360 architecture first, was never going to run just as smoothly on Wii U without a lot of optimizing and re-coding. The tiny Wii U Treyarch team (6-7 guys) should be congratulated for getting the game running "good enough" for the launch of the console with probably less than a year of development.

Basically we are seeing Xbox 360 games cut and pasted onto the Wii U with very little optimizing, not taking advantage of the very important OoOE CPU that would deliver much better results if developers wanted to take the time to use it. Which they don't, and I totally understand why, since it's just not cost-effective for them to spend a lot of money on a new console with a small user base.

Points to take from this:

  • The Wii U CPU is modern, not 7-year-old tech
  • The Wii U CPU is not being used in these ports the way it was designed to be used
  • Third parties are happy porting over Xbox 360 games with minimal optimization and no improvements to graphics, since doing more requires money and work they can't afford.


Whoa, let me get this straight. You seriously think that Radeon 4xxx architecture can produce 570-600 GFLOPS in a system that has a total power consumption of 30-35W?

Now that's some serious wishful [totally unrealistic] thinking.
 

StevieP

Banned
This thread is an absolute disaster.

On a related note, I believe that there will be quite a few cross-generation games during the early years of the PS4/Durango life.

More than you know. Thing is, Wii U is still getting left out of many of these equations.

Sure there were PS2/Xbox ports, but in all those cases, the 360 versions (even the crappy GUN port) were clearly and demonstrably better than the previous-generation console versions. The ones that didn't have much of an increase in model/environment detail had a massive leap in resolution and image quality.

We're not seeing that with Wii U. At all.

Because the Wii U is not orders of magnitude more powerful than the previous generation. Facepalm?

Wouldn't it be reasonable to say that the developers of launch day Wii U ports likely did exactly what you are saying? Or are the available tools so shitty that these ports' performance was still hampered?

I'm pretty sure that the "2nd wave of launch titles" like Pikmin, Wonderful 101 and Aliens: CM will better serve as the basis for comparison.

Nintendo's tools through pre launch were probably kinda shit, considering compiler breaks, hard locks, hardware changes, etc. Hell they probably still are even with the final console.
 
This thread is an absolute disaster.



More than you know. Thing is, Wii U is still getting left out of many of these equations.



Because the Wii U is not orders of magnitude more powerful than the previous generation. Facepalm?



Nintendo's tools through pre launch were probably kinda shit, considering compiler breaks, hard locks, hardware changes, etc. Hell they probably still are even with the final console.
StevieP
Doesn't actually understand technology or have insider info.
 

Rolf NB

Member
The whole point of OOOE is that you don't need to fiddle with your code to get optimal performance. The CPU rearranges the instruction stream itself. In hardware. In realtime. To get through any slobbery mess of code as quickly and efficiently as possible.
 

wsippel

Banned
The whole point of OOOE is that you don't need to fiddle with your code to get optimal performance. The CPU rearranges the instruction stream itself. In hardware. In realtime. To get through any slobbery mess of code as quickly and efficiently as possible.
That statement is simplified to the point of being wrong. OoOE still requires well-optimized code. OoOE CPUs don't stall as easily when waiting for operands, that's pretty much it.
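A toy example of what "still requires well-optimized code" means in practice (my own illustration, nothing Wii U specific): out-of-order hardware can overlap independent work, but it can't invent independence the code doesn't have.

    #include <cstddef>

    struct Node { int value; Node* next; };

    // The loads here are independent, so an OoOE core (plus the prefetcher) can
    // overlap them and hide a lot of memory latency.
    long SumArray(const int* data, std::size_t n) {
        long total = 0;
        for (std::size_t i = 0; i < n; ++i)
            total += data[i];
        return total;
    }

    // One long dependency chain: each node's address comes from the previous load,
    // so every cache miss is paid in full. Data layout still matters, OoOE or not.
    long SumList(const Node* head) {
        long total = 0;
        for (const Node* p = head; p != nullptr; p = p->next)
            total += p->value;
        return total;
    }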
 

Durante

Member
wsippel said:
That statement is simplified to the point of being wrong. OoOE still requires well-optimized code. OoOE CPUs don't stall as easily when waiting for operands, that's pretty much it.
It's heavily simplified, but it's not really wrong. The oft-repeated statement that game code is currently optimized for in-order and therefore won't work well on out-of-order architectures is more problematic.

That said, switching compilers can be a 'fun' experience. Heck, switching compiler versions can be a fun experience.
Heh. The project I'm currently working on only compiles with a specific release of gcc (that's 4.6.3, not something "general" like 4.6) and a specific sub-version of boost.

Question: how would the eDRAM improve the overall bandwidth of the main RAM?
The answer to the question as stated is "not at all". So I guess you wanted to ask something different.
 

wsippel

Banned
It's heavily simplified, but it's not really wrong. The oft-repeated statement that game code is currently optimized for in-order and therefore won't work well on out-of-order architectures is more problematic.
Even OoOE CPUs still do what they're told. They're certainly more efficient, but Rolf's post made it sound like optimized code isn't important for OoOE CPUs - and that's wrong.

Heh. The project I'm currently working on only compiles with a specific release of gcc (that's 4.6.3, not something "general" like 4.6) and a specific sub-version of boost.
Yeah, I love that stuff as well. My first ever C project had the same issue. And on top of that, it only worked in debug builds. Release builds always segfaulted.
 
Whoa, let me get this straight. You seriously think that Radeon 4xxx architecture can produce 570-600 GFLOPS in a system that has a total power consumption of 30-35W?

Now that's some serious wishful [totally unrealistic] thinking.


I do think that 570-600 GFLOPS is stretching things a bit, but one should consider that the HD4770 (which is the only valid comparison in the HD4xxx line) managed 960 GFLOPS with 75-80W power consumption (including 512MB of GDDR5 memory). Considering the size of the Wii U GPU relative to the CPU, the Wii U GPU could easily be in the 20 or even 25W range.
So, considering various optimizations that should have increased efficiency, plus significantly lower clock speeds, 400-500 GFLOPS should be feasible.
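For what it's worth, the arithmetic behind these peak figures is just ALUs x 2 ops (one multiply-add per ALU per cycle) x clock. The HD4770 line below checks out against the 960 GFLOPS figure; the Wii U rows are pure placeholders to show how a 400-500 GFLOPS estimate would arise, not leaked specs.

    #include <cstdio>

    // Peak throughput for these AMD parts: ALUs * 2 (MADD per cycle) * GHz.
    double PeakGflops(int alus, double ghz) { return alus * 2.0 * ghz; }

    int main() {
        std::printf("HD4770 (640 ALUs @ 750MHz):      %.0f GFLOPS\n", PeakGflops(640, 0.75)); // 960
        std::printf("hypothetical 320 ALUs @ 550MHz:  %.0f GFLOPS\n", PeakGflops(320, 0.55)); // 352
        std::printf("hypothetical 400 ALUs @ 600MHz:  %.0f GFLOPS\n", PeakGflops(400, 0.60)); // 480
        return 0;
    }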
 
It's heavily simplified, but it's not really wrong. The oft-repeated statement that game code is currently optimized for in-order and therefore won't work well on out-of-order architectures is more problematic.


Wouldn't code optimized for one CPU not be optimized for another? Especially considering two different compilers are used, the code being ported to WiiU must have changed, and therefore could have become sub-optimal in the process. A lot of effort is expended in just getting the code to compile, then debugged. Optimal performance is not a given just because you have the code compiled and bug-free.
 
I do think that 570-600 GFLOPS is stretching things a bit, but one should consider that the HD4770 (which is the only valid comparison in the HD4xxx line) managed 960 GFLOPS with 75-80W power consumption (including 512MB of GDDR5 memory). Considering the size of the Wii U GPU relative to the CPU, the Wii U GPU could easily be in the 20 or even 25W range.
So, considering various optimizations that should have increased efficiency, plus significantly lower clock speeds, 400-500 GFLOPS should be feasible.

cough cough Radeon HD Mobility 4830 cough cough
 
I do think that 570-600 GFLOPS is stretching things a bit, but one should consider that the HD4770 (which is the only valid comparison in the HD4xxx line) managed 960 GFLOPS with 75-80W power consumption (including 512MB of GDDR5 memory). Considering the size of the Wii U GPU relative to the CPU, the Wii U GPU could easily be in the 20 or even 25W range.
So, considering various optimizations that should have increased efficiency, plus significantly lower clock speeds, 400-500 GFLOPS should be feasible.

The point is that the Wii U dev kit GPUs were based on the Radeon 4850 1GB card as a reference. The performance of that card reached close to 1 TFLOP......so the Wii U GPU, even though it uses far less power (watts), will be able to achieve results similar to that GPU with fewer GFLOPS, hence my 500-600 estimate.


edit: remember too that the GPU was upgraded after the whole 4850 info came out......
 
The point is that the Wii U dev kit GPUs were based on the Radeon 4850 1GB card as a reference. The performance of that card reached close to 1 TFLOP......so the Wii U GPU, even though it uses far less power (watts), will be able to achieve results similar to that GPU with fewer GFLOPS, hence my 500-600 estimate.


edit: remember too that the GPU was upgraded after the whole 4850 info came out......

And doesn't the e6760 offer similar performance to the 4850 despite low power use and only 60% of the GFLOPS?
 
The point is that the Wii U dev kit GPUs were based on the Radeon 4850 1GB card as a reference. The performance of that card reached close to 1 TFLOP......so the Wii U GPU, even though it uses far less power (watts), will be able to achieve results similar to that GPU with fewer GFLOPS, hence my 500-600 estimate.


edit: remember too that the GPU was upgraded after the whole 4850 info came out......

Did we ever get complete confirmation from a dev that the GPU in the early dev kits was practically a 4850?
 
cough cough Radeon HD Mobility 4830 cough cough


Mea culpa, I forgot about those Mobility Radeons that also use the RV740. Technically the HD4750 uses the RV740 as well. My main point was/is that the HD4770 has a GPU on the 40nm process and we also know how much power it uses.



Trevelyan9999 said:
The point is that the Wii U dev kit GPUs were based on the Radeon 4850 1GB card as a reference. The performance of that card reached close to 1 TFLOP......so the Wii U GPU, even though it uses far less power (watts), will be able to achieve results similar to that GPU with fewer GFLOPS, hence my 500-600 estimate.


edit: remember too that the GPU was upgraded after the whole 4850 info came out......


Correct me if I'm wrong, but afaik we never got any official confirmation that the Wii U dev kits used the HD4850 GPU.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Bail out to blu's thread. Bomb dropped there that Nintendo apparently haven't synced the clocks this time around. My mind is blown. All clock speed estimates out the window.

Wat
 

OniShiro

Banned
Bail out to blu's thread. Bomb dropped there that Nintendo apparently haven't synced the clocks this time around. My mind is blown. All clock speed estimates out the window.

Wouldn't unsynced clocks mean much worse performance, because components would have to lose cycles waiting for other components?
 

Alexios

Cores, shaders and BIOS oh my!
I don't see bombs dropped there, just some speculation on the frequencies. Which may be true, false or somewhere in the middle.
 

OniShiro

Banned
I mean how they are not synced. What does it mean and can it be fixed?

It means that some components will have to spend idle cycles waiting for the rest to finish their work.

With synced clocks they would still have to wait, but fewer cycles are wasted.
 
I mean how they are not synced. What does it mean and can it be fixed?

Nintendo just has a habit of using component speeds that are synced - basically all multiples (by varying amounts) of a base number - so if you know the speed of one component you can take better guesses at the speeds of the others.

As it now seems they aren't all in sync, they could be anything.
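For anyone wondering what "synced" means in this context, the older consoles used simple integer ratios off a shared base clock, which is why knowing one speed pinned down the others. A quick illustration using the commonly cited retail clocks (nothing Wii U specific):

    #include <cstdio>

    int main() {
        const double gamecube_gpu = 162.0; // MHz (Flipper)
        const double wii_gpu      = 243.0; // MHz (Hollywood)
        std::printf("GameCube CPU: %.0f MHz (3 x GPU clock)\n", gamecube_gpu * 3.0); // 486
        std::printf("Wii CPU:      %.0f MHz (3 x GPU clock)\n", wii_gpu * 3.0);      // 729
        std::printf("Wii vs GameCube base: %.1fx\n", wii_gpu / gamecube_gpu);        // 1.5x
        return 0;
    }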
 
I do think that 570-600 GFLOPS is stretching things a bit, but one should consider that the HD4770 (which is the only valid comparison in the HD4xxx line) managed 960 GFLOPS with 75-80W power consumption (including 512MB of GDDR5 memory). Considering the size of the Wii U GPU relative to the CPU, the Wii U GPU could easily be in the 20 or even 25W range.
So, considering various optimizations that should have increased efficiency, plus significantly lower clock speeds, 400-500 GFLOPS should be feasible.

Looking at the die size and wattage, I don't think that's feasible out of a 35W machine. The GPU probably tops out at 25W, and some of the die space is occupied by components that wouldn't be there in a standard GPU, which inflates the apparent size.

Question: how would the eDRAM improve the overall bandwidth of the main RAM?

eDRAM is not a savior. I wouldn't put too much stock in it.
 
Yes indeed. This is exactly the basis of what some developers are complaining about. They simply don't want to re-code games from the ground up to take advantage of the OoOE capabilities of the Wii U CPU, when that's actually the most important feature for getting the most out of the CPU's performance.

Hence: ports that run a tad worse than games made from the ground up for Xbox 360.

Ports from Xbox 3 or PS4 would fare better, since those games would be built on the same principles as the Wii U CPU/GPU, albeit for systems with higher specs than the Wii U. So basically the performance would be lower on Wii U, with a few corners cut, but the core game should/would look very close to the other systems' versions, because the Wii U was designed around modern tech.

-See The Witcher 2 Xbox 360 vs PC

Bgassassin made a good assessment and I stand by what he said: Wii U's games would be like how PS2 vs Xbox Splinter Cell games were. The Xbox had more bells and whistles but the games weren't a huge leap over PS2.

I wouldn't worry about Van Owen, he is just upset because his mummy won't buy him a WiiU for Xmas, so he is now badmouthing it to anyone that will listen ;).

Like most 'U Trolls', his argument will shift from games being worse than the PS360 versions to games being worse than the PS4/720 versions sometime next year, once we see some of the big-name exclusive WiiU games built from the ground up, taking advantage of an OoO CPU, 32MB of eDRAM, the extra RAM and the new effects a 2011-feature-set GPU can achieve.

You would have to be pretty dumb (which he is) to think that the WiiU can't handle Black Ops 2, which is based on a 1999 engine that ran on Dreamcast...

What about the games that are on par or even better on WiiU, like Assassin's Creed 3, Sonic Racing 2, etc.?

Ignore him, if you want my advice.

He is a troll, nothing more.
 

MDX

Member
I don't think it matters much nowadays... It was more important in 2d consoles.


3DS afaik is synced too.

I find it strange after so many generations that Nintendo is not using some kind of multiplier to balance their system. Maybe due to the multi-core architecture it's not so apparent where the balance was made.
 