
Rumor: Wii U final specs


Nano Assault Neo

Also, developers are on record saying that RAM is not a latency bottleneck in the WiiU and that the GPU is more powerful and modern than the current gen GPUs.

The only bottleneck in the system is the CPU, but this shouldn't stop it from producing better graphics than current gen.

What we are looking at is unoptimized code that was rushed for launch.
 
The only thing that will stop the Wii U producing games significantly better looking than the best of ps360 is time, effort, and budget. Unfortunately, few exclusive games will have all of that put in, but there should be a couple at least.
 
It clearly can not!

Nano Assault Neo is nothing special, and not even close to beyond what the 360 and PS3 can do!


Nano Assault is pulling off the effect without using tricks. It's API-level code, easily accessible by the game code.

There is no need to rely on low level programming or prebaked/faked effects like current gen.

Will the results look the same? Maybe, but it is definitely easier to achieve on the WiiU.
 
Nano Assault Neo

Also, developers are on record saying that RAM is not a latency bottleneck in the WiiU and that the GPU is more powerful and modern than the current gen GPUs. [...]

Nano Assault Neo looks good and it is 60fps, but come on, it doesn't look "next gen", it doesn't show anything really impossible for current gen consoles, and it uses small environments.

And I am sure that the Wii U can show better graphics than the Xbox 360, but NAN is not a good example.



There is no need to rely on low level programming or prebaked/faked effects like current gen.

How do you know that they are not using "tricks" (such as prebaked shadows) in the game?

Will the results look the same? Maybe, but it is definitely easier to achieve on the WiiU.

Any proof? And there are some current gen games using real time shadows.
 

TheD

The Detective
Nano Assault is pulling off the effect without using tricks. It's API-level code, easily accessible by the game code.

There is no need to rely on low level programming or prebaked/faked effects like current gen.

You have no idea what you are talking about!

All real time rendering is using tricks!
GFX APIs (if needed) are always accessible by game code; they could not render anything otherwise!
The 360 and PS3 also have APIs and don't need to be programmed to the metal.
Pre-baking is used in games on all platforms; the WiiU is not even close to powerful enough to never need it.
And it is not as if the 360 and PS3 don't have games without it.
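To make the pre-baking point concrete, here is a toy cost model (all numbers are invented for illustration, not measurements from any console): a baked lightmap pays the lighting cost once, offline, and at runtime each pixel just does a texture fetch, while fully dynamic lighting re-evaluates every light at every pixel, every frame.

```python
# Toy illustration of baked vs. real-time lighting cost.
# All cost units and counts are made up; only the ratio matters.

PIXELS = 1280 * 720          # pixels shaded per frame
FETCH_COST = 1               # arbitrary units for one lightmap lookup
LIGHT_EVAL_COST = 8          # arbitrary units to evaluate one dynamic light
NUM_LIGHTS = 4               # dynamic lights touching every pixel

def baked_cost_per_frame():
    # The bake was paid offline; runtime is just one fetch per pixel.
    return PIXELS * FETCH_COST

def realtime_cost_per_frame():
    # Every light evaluated at every pixel, every frame.
    return PIXELS * LIGHT_EVAL_COST * NUM_LIGHTS

print(baked_cost_per_frame())     # 921600
print(realtime_cost_per_frame())  # 29491200 -- 32x the baked cost here
```

With these invented numbers the fully dynamic path is 32x more expensive per frame, which is the whole reason pre-baking stays attractive on every platform, including ones with plenty of shader power.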
 

Kingven

Member
Wii U:

[image: Frieza "this isn't even my final form" meme]
 
You have no idea what you are talking about! All real time rendering is using tricks! [...]

True, but the Wii U GPU is capable of doing all the "real" effects without faking them in these ports from the current gen HD systems. Third Parties however are not going to be putting extra resources in getting those effects to work on the Wii U - a platform that just launched and has no guarantee that their extra effort and money spent on ports will give them any money back in return.

The Wii U GPU is not an e6760, granted, but the GPU itself will perform very similarly to that model with some "custom Nintendo tweaks" added in: 570-600 GFLOPS and all the modern effects like compute shading, but in an OpenGL format of course... the GPU is not the cause of concern though, right?

The Wii U CPU is a new and extremely efficient processor that does Out of Order Execution, which has not been shown at all in these ports. The Wii U CPU is most likely running these ports with "In Order Execution", thus only relying on raw data transfer in parallel like an older-style CPU, similar to what is in the Xbox 360. Since it's clocked lower than those older "in-order" CPUs, developers are having performance issues with the CPU. Understandable. All modern CPUs are capable of Out of Order Execution, including Sandy Bridge and Bulldozer: http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=98&Itemid=1&limit=1&limitstart=9


Take a look at the Benefits of OoOE vs IoE: http://www.cs.washington.edu/education/courses/csep548/06au/lectures/introOOO.pdf

The Wii U CPU, when developed for with its intended mode of operation (OoOE), would run circles around the last gen CPU in the Xbox 360. A game like Black Ops 2, which had been in development for 2+ years and was made from the ground up to take the most advantage of the 360 architecture first, was never going to run just as smoothly on Wii U without a lot of optimizing and re-coding. The tiny Wii U Treyarch team (6-7 guys) should be congratulated that they were able to get the game running "good enough" for the launch of the console with probably less than a year of development.

Basically we are seeing Xbox 360 games being cut and pasted onto the Wii U with very little optimizing and without taking advantage of the very important OoOE CPU that would deliver a lot better results if developers wanted to take the time to use it. Which they don't, and I totally understand the reasons why, since it's just not cost effective for them and the game industry to spend a lot of money on a new console with a small user base.

Points to take from this:

  • The Wii U CPU is modern, not 7-year-old tech
  • The Wii U CPU is not being used in these ports like it was designed to be used
  • Third Parties are happy porting over Xbox 360 games with minimal optimization and no improvements to graphics since it requires more money and work, which they can't afford.
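The in-order vs. out-of-order argument above can be made concrete with a toy model (pure illustration: the single-issue pipeline and the latencies are invented, not Wii U or Xenon specifics). When a long-latency load stalls the next dependent instruction, an in-order core sits idle; an out-of-order core issues independent work in the meantime.

```python
# Toy single-issue pipeline model (not any real CPU): each instruction has a
# latency and a list of instructions whose results it needs. An in-order core
# must stall when the next instruction's inputs aren't ready; an out-of-order
# core may issue any ready instruction instead, hiding the stall.

# (latency, dependencies) -- e.g. instruction 1 needs the result of 0.
PROGRAM = [
    (4, []),      # 0: load (long latency)
    (1, [0]),     # 1: add, uses the load result
    (1, []),      # 2: independent add
    (1, []),      # 3: independent add
    (1, [1]),     # 4: uses instruction 1
]

def in_order_cycles(program):
    ready_at = {}
    cycle = 0
    for i, (lat, deps) in enumerate(program):
        # Must wait until all inputs of the *next* instruction are ready.
        start = max([cycle] + [ready_at[d] for d in deps])
        ready_at[i] = start + lat
        cycle = start + 1          # one instruction per cycle, strictly in order
    return max(ready_at.values())

def out_of_order_cycles(program):
    ready_at, done = {}, set()
    cycle = 0
    while len(done) < len(program):
        # Issue the first not-yet-done instruction whose inputs are ready.
        for i, (lat, deps) in enumerate(program):
            if i not in done and all(d in done and ready_at[d] <= cycle
                                     for d in deps):
                ready_at[i] = cycle + lat
                done.add(i)
                break
        cycle += 1
    return max(ready_at.values())

print(in_order_cycles(PROGRAM), out_of_order_cycles(PROGRAM))  # 8 6
```

Even on this five-instruction toy, the out-of-order core finishes in 6 cycles against 8 in order, because it slips the two independent adds under the load's latency; code scheduled by the compiler for an in-order CPU (as 360 ports would be) gives that benefit away.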
 

QaaQer

Member
True, but the Wii U GPU is capable of doing all the "real" effects without faking them in these ports from the current gen HD systems. [...]

Oh Reggie...
 

atbigelow

Member
Now, I like Nano Assault Neo quite a lot, but this is a silly exchange:

A: "NAN uses effects other systems can't!"

B: "No it doesn't."

A, to B: "PROVE IT"

How about A proves things first??? I love Shinen and NAN but it'd be nice if the original claim here was backed up first.
 

sp3000

Member
True, but the Wii U GPU is capable of doing all the "real" effects without faking them in these ports from the current gen HD systems. [...]

I like looking into this thread just to see hilarious posts like this
 

Meelow

Banned
True, but the Wii U GPU is capable of doing all the "real" effects without faking them in these ports from the current gen HD systems. [...]

Interesting.
 

The_Lump

Banned


Just gonna add some gasoline to this inferno: Trine 2 devs did say the Goblin Menace DLC included in the WiiU Director's Cut wouldn't technically be graphically possible on ps360...

The Wii U is capable of displaying Trine 2 expansion Goblin Menace with the graphics as intended, something developer Frozenbyte believes isn't possible on Xbox 360 and PS3.

..."Basically that does require... well, not huge amounts more graphics processing power, but still considerable. If we would publish that on the other consoles, then I believe that there would be some small downscaling of what it is right now."

I'm aware this could really be referring to anything, not necessarily a specific graphical effect. But....I like to watch the world burn so there it is ;)

Source: http://www.videogamer.com/wiiu/trine_2/news/wii_u_graphics_would_need_to_be_scaled_back_on_xbox_360_and_ps3.html
 
Just gonna add some gasoline to this inferno: Trine 2 devs did say the Goblin Menace DLC included in the WiiU Director's Cut wouldn't technically be graphically possible on ps360... [...]

Someone should tell them they are wrong because Black Ops 2 and GAF told me so.
 

Ryoku

Member
True, but the Wii U GPU is capable of doing all the "real" effects without faking them in these ports from the current gen HD systems. [...]

What I find unfortunate is that people will read this (or not) and just post useless, uninformative responses along the lines of "lol". Or they will respond with equally useless posts like mine :/
 

Meelow

Banned
So, someone actually speaking about the technical side of the Wii U in a thread about specs, and you laugh it off...how very nice of you.

I guess people don't like hearing that the Wii U has potential to come out with great looking/running games.

Or I am just speaking my mind :(
 
What I find unfortunate is that people will read this (or not) and just post useless, uninformative responses along the lines of "lol". Or they will respond with equally useless posts like mine :/

I think that already happened... but I know what you mean. I think I made a pretty good case about the CPU, though, but people will believe what they want.

It might have issues with running current gen games, but how would the Wii U fare vs the Xbox 360 when running next gen games? That is more what the system was designed to run, since it has modern technology. It's not that hard to understand when you think about it.
 

NBtoaster

Member
Transparencies seem to be causing the bulk of issues in ports, in addition to AI. That's an issue with GPU and RAM, not the CPU. How would you solve that on the Wii U?

PS3 has a similar problem, and in 6 years there's never really been a solution other than to reduce their resolution, which can be a big ugly downgrade (GT5).
 

Van Owen

Banned
True, but the Wii U GPU is capable of doing all the "real" effects without faking them in these ports from the current gen HD systems. [...]

Sorry, but the Wii U CPU is "horrible and slow".
 

The_Lump

Banned
So, someone actually speaking about the technical side of the Wii U in a thread about specs, and you laugh it off...how very nice of you.


I wish he'd explain why it's funny, rather than just fobbing it off.

Granted, some of it is speculative - but that's the point of this thread. And most of what he's saying is solidly based on fact and/or educated guesses/speculation from some reliable sources. At least he's backing his opinion up with some solid reasoning. Plus what he's saying makes a lot of sense. I guess it just doesn't make sense if you're looking at it through turd-tinted glasses to begin with...
 
I like looking into this thread just to see hilarious posts like this

It's kind of sad you spend time on a forum just to laugh at the more insightful/interesting posts on the forum. Suppose it's better for a post to be funny than dull like your own.


Anyway, interesting thoughts Trevelyan9999; it's obvious that multiplatform is a major issue though. Large segments of games would have to be changed, as all the things 'hiding' activity in the background would have to change with what's going on behind the scenes.

For exclusives it's great; for games that it just fits, good.
Otherwise I struggle to see how straight-up ports are all that possible.

That's Nintendo's choice, of course; they asked their people 'what do you need?' and this was it. Not third parties. I don't blame them all that much; even when they try they tend to be ignored.


Bayonetta 2 will be the very very interesting one.

I guess it just doesn't make sense if you're looking at it through turd-tinted glasses to begin with...

Problem at this stage is that everyone who called the WiiU out when the CPU problems were unknown and it just looked to have great RAM now feels vindicated.
It makes the speculation all a bit pointless when people will create narratives about anyone being positive/negative about a system.
 

Ryoku

Member
I think that already happened... but I know what you mean. I think I made a pretty good case about the CPU, though, but people will believe what they want. [...]

I don't know what is up with some of the people on here regarding the Wii U. Is it a personal grudge? I have no idea, nor do I care much at this point. I'm still waiting for a game that shows off the system's capabilities. I gave up on trying to defend the system a while ago. Now they laugh every time the term "GPGPU" or "modern feature set" is mentioned, with no basis at all. Oh well :/
 

Durante

Member
Transparencies seem to be causing the bulk of issues in ports, in addition to AI. That's an issue with GPU and RAM, not the CPU. How would you solve that on the Wii U?

PS3 has a similar problem, and in 6 years there's never really been a solution other than to reduce their resolution.

Normally, this really would be a case where the answer is "eDRAM". You'd expect that the massive bandwidth provided for the FB by storing it in the eDRAM would be perfect for rendering lots of transparencies.

So I don't really get what's going on there, since I'm sure that even the "dirtiest" port stores its FB in eDRAM.
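The bandwidth intuition here can be put into rough back-of-the-envelope numbers (all figures below are illustrative assumptions, not confirmed specs for any console): alpha blending is a read-modify-write on the framebuffer, so each full-screen transparent layer costs roughly one read plus one write of every covered pixel.

```python
# Back-of-the-envelope framebuffer traffic for alpha-blended overdraw.
# All numbers are illustrative assumptions, not confirmed hardware specs.

WIDTH, HEIGHT = 1280, 720
BYTES_PER_PIXEL = 4            # 32-bit color
FPS = 60

def blend_traffic_gb_per_s(layers):
    # Alpha blending is read-modify-write: ~1 read + 1 write per covered pixel.
    per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL * 2 * layers
    return per_frame * FPS / 1e9

# 8 full-screen transparent layers at 720p60:
print(round(blend_traffic_gb_per_s(8), 1))  # 3.5 (GB/s)
```

Even 8 layers of full-screen 720p60 blending is only a few GB/s of pure framebuffer traffic, which is trivial for on-die eDRAM but a real slice of a narrow external memory bus where it competes with textures, geometry, and the CPU; that asymmetry is the usual argument for keeping the framebuffer in eDRAM.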
 
Sorry, but the Wii U CPU is "horrible and slow".

The poster is talking about how the CPU functions. It's certainly slow.
Horrible? Well, based on what they're saying, it depends on where you're standing.

What he's saying is nonsense. Does he have proof that every single one of those ports were cut and paste jobs?

They're ports. They're not simple cut and paste jobs, no, and I doubt other ports will improve massively (they will improve, however); the point is that the tech is functional, just that resources have to be put in to really use it effectively.
 

squidyj

Member
I don't know what is up with some of the people on here regarding the Wii U. [...]

They're not magic wands that are going to fix what has been shown through the TDP and silicon budget. It's still bad.
 

The_Lump

Banned
What he's saying is nonsense. Does he have proof that every single one of those ports were cut and paste jobs?


Even if he doesn't, does that render his post nonsense?

It's a perfectly reasonable post. Presenting no evidence to back up an opinion is what I would consider nonsensical. So no, I don't think it's nonsense.

Maybe discuss why you think it's nonsense, instead of rejecting it outright?
 

Diablos54

Member
What he's saying is nonsense. Does he have proof that every single one of those ports were cut and paste jobs?
*Looks at port performance*
*Looks at Wii U exclusive performance*

Seems pretty obvious to me. Obviously some tinkering had to be done, but he's pretty much right, the ports are... Well... Quick and dirty ports. Shock, I know.

It'll run IoE instructions very crappily--especially since it doesn't have the brute force to overcome the issue without major re-coding.
Really? I heard that OoOE CPUs can run in-order code just as well as non-OoOE CPUs (dunno what to call them, lol).
 
I can understand where the trolls are coming from, as it's a bit of an oversight by Nintendo not to develop a CPU strong enough to brute-force emulate 7-year-old console CPU functions.

Very few devs are going to take the time to re-engineer their game engines to take advantage of the WiiU OoOE CPU and unless MS and Sony do something horribly wrong ($599+) it's highly unlikely that the WiiU will be lead platform for multiplats or get non-moneyhat exclusives.

Nintendo 1st party games should be eye-melting at least

A CPU can be horrible and slow and still be modern and not 7 year old tech.
See the entire Tegra line.
 
No, and I'm not making any claims that they were or weren't. But I'm not basing any arguments around a dumb assumption.

It's in no way a 'dumb assumption'; it's common sense.
The poster is saying that a lot of resources would have to go into it.

Stop being an arse; you can disagree with an assumption, but it's hardly a dumb one.


Only games produced with the Wii U in mind (at all stages of development) will really be using the hardware as intended. E.g. Nintendo. Or with co-development of WiiU/PS360 throughout.

Wii U was an afterthought; I am sure the devs worked hard (evident in things like BLOPS 2), and in the case of Sonic it appears they really got behind the hardware.

But they didn't have the resources required.
The poster's argument is logically sound on all fronts. I suggest you read what they said again.
 