
WiiU technical discussion (serious discussions welcome)

Rolf NB

Member
Sure, because MEM1 is no framebuffer.
Even if that were more than wishful thinking, the point remains that it's a tiny fraction of the memory: 1.5% of the total. Even if it can be used for things other than the framebuffer, it still can't be used for much else. The thing you refer to as the Gamecube's MEM1, OTOH, is main memory, 57% of the total, usable for the majority of things. That's why your comparison is off either way.

You should compare the WiiU's eDRAM to the Flipper/Hollywood eDRAM. Flipper's eDRAM is 4.7% of the total memory in the Gamecube, 2.2% in the Wii, and it provides 18GB/s of bandwidth, 6x faster than its main memory. See how this comparison makes way more sense?
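If you want to sanity-check those ratios yourself, here's the napkin math in Python (the pool sizes and bandwidths are the commonly cited figures, and the Wii main-memory bandwidth is my own assumption of a 1.5x clock bump over Gamecube, so treat the exact multiples as ballpark):

```python
# Rough memory-pool ratios (sizes in MB, bandwidth in GB/s).
# Commonly cited figures; the Wii main-memory BW is an assumption.
systems = {
    # name: (fast pool, total memory, fast-pool BW, main-memory BW)
    "Gamecube": (2, 24 + 16 + 2, 18.0, 2.6),   # Flipper eDRAM vs 1T-SRAM
    "Wii":      (2, 24 + 64 + 2, 18.0, 3.9),   # same Flipper-style eDRAM
    "Wii U":    (32, 2048 + 32, None, 12.8),   # eDRAM BW unknown; DDR3-1600
}

for name, (fast, total, fast_bw, main_bw) in systems.items():
    share = 100.0 * fast / total
    rel = f"{fast_bw / main_bw:.1f}x main memory" if fast_bw else "BW unknown"
    print(f"{name:8s}: fast pool = {share:4.1f}% of total, {rel}")
```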

wsippel said:
The MEM1/MEM2 hierarchy is an obvious evolution of the main RAM/aux RAM concept. ARAM/MEM2 became much bigger, but also much faster, over the years compared to MEM1, but the idea is still to have very fast memory for highly performance-critical stuff and "mass storage" for everything else. Is 32MB enough? No idea, but MEM2 certainly doesn't need to be bigger than MEM1 with current workloads.
Gamecube ARAM was little more than a glorified disc streaming buffer, too slow to be useful for anything but that and audio. Expanding it in the Wii to the point where it became the biggest pool was a crutch to get the total up, and only happened because Nintendo was already in full-on penny-pinching mode.

But we won't be holding up the Wii as an example of a performance balanced system to begin with.
 

Terra

Member
Look at Xenoblade for a good example of what the Wii could muster. The Wii U will do just fine in terms of technology.

Xenoblade is an exceptional title, but it's far from a graphical showpiece. The gap between it and Monolith's last PS2 game, Xenosaga 3, is not particularly massive, despite the PS2 being hands down the weakest of the "big three" consoles of its era.

Or compare it to another similarly open-world "real battlefield combat" RPG, Final Fantasy XII, also on PS2 hardware far inferior to the Wii's. Scope, draw distance, poly count, texture detail, IQ, etc. are generally quite comparable.

One of the best games in recent history, but as a technical achievement it isn't a great argument for Monolith tapping into some great reserve of hidden power within the Wii.


I beg to differ on both comparisons. The 'open world' of Final Fantasy XII is not even slightly comparable to the one in Xenoblade.
And was the PS2 really far inferior to the other consoles of its generation? Look at the graphics in God of War at the end of its cycle.
 

ugoo18

Member
I just don't think people are giving Nintendo enough credit. People seem to act as if they're new to the game and would bottleneck a system out of incompetence. We can't just look at the numbers and assume we know what this console can or will do; you have to take the system as a whole. I think that years from now we will be amazed at what the Wii U accomplishes and will kinda laugh at the debates we're having now. But like all things, only time will tell.

I'm just hoping that years from now the WiiU has 3rd-party support equal to or better than the Gamecube's (by which I mean things like DICE actually wanting Battlefield to be a Gamecube exclusive, a deal that Nintendo's lack of an online strategy absolutely bungled - Nintendo were all for it as well, which makes it even worse lol).
 

Rolf NB

Member
I just don't think people are giving Nintendo enough credit. People seem to act as if they're new to the game and would bottleneck a system out of incompetence.
Not incompetence. They are scrooges; they squeeze every last fraction of a penny out of hardware. That's thrift, not incompetence.
Tron said:
We can't just look at the numbers and assume we know what this console can or will do; you have to take the system as a whole. I think that years from now we will be amazed at what the Wii U accomplishes and will kinda laugh at the debates we're having now. But like all things, only time will tell.
Agreed on the laughing only. We know well enough from the silicon budgets and bandwidths where this is going to fall. Chip design is a matter of science and physics, not faith and magic. You can't somehow perform the same amount of computational work with half the transistors at half the clocks just because you're Nintendo.
 

Drek

Member
I beg to differ on both comparisons. The 'open world' of Final Fantasy XII is not even slightly comparable to the one in Xenoblade.
And was the PS2 really far inferior to the other consoles of its generation? Look at the graphics in God of War at the end of its cycle.

1. The Wii has nearly three times the total RAM of the PS2; of course its world can be bigger. The worlds aren't orders of magnitude more complex, however, and similar worlds were realized on the PS2 (the entire Grand Theft Auto series, for example). That isn't much of an argument for technical proficiency.

2. Yes, the PS2 really was significantly inferior. It had far too little memory compared to its peer group (the Gamecube had nearly 50% more, some significantly superior chips, and better allocation; the Xbox had twice as much in a similar shared-pool architecture), an architecture that was problematic industry-wide (though somewhat forward-thinking), and its biggest advantage (a massive RAM bus) was basically required to cover for a near-generational gap in hardware feature set (many PS2 games used the fat RAM bus to render at much higher resolutions than the final output as a "cheat" for anti-aliasing, a feature included in hardware on both the Gamecube and Xbox).

3. The God of War series has historically relied massively on pre-rendered backgrounds. It wasn't until God of War 3 on PS3 that the series saw much in the way of real-time background detail. It isn't a technical benchmark of anything, other than what very strong artistic design can do regardless of hardware.
 

wsippel

Banned
Not incompetence. They are scrooges; they squeeze every last fraction of a penny out of hardware. That's thrift, not incompetence.
It's also not true. There are a lot of instances where Nintendo went with the more expensive option. If anything, the original Xbox 360 was pure penny-pinching. The cheapest, shoddiest piece of hardware ever designed - because every single penny went into adding features and making the thing faster, and nobody at MS gave a shit that the system would probably fall apart if you did as much as sneeze in the same room.
 

Rolf NB

Member
What was Wii's imbalance?
Compute-starved. They more than tripled aggregate main memory bandwidth and more than doubled its size - or, if you take into account that Gamecube ARAM was too slow for any meaningful working set, unlike in Wii, you might as well say they scaled it by 3.6x. OTOH, GPU and CPU performance scaled up by only 50%. Plus the framebuffer size restrictions remained.

If we regard the Gamecube as a balanced, efficient design, we simply have to observe how unevenly Wii's components were scaled up.
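Rough sketch of that scaling argument, using the well-known clocks and sizes (the working-set line excludes Gamecube ARAM, per the reasoning above):

```python
# How unevenly Wii scaled up Gamecube (well-known clocks and RAM sizes).
gc_cpu, wii_cpu = 485, 729        # Gekko vs Broadway, MHz
gc_gpu, wii_gpu = 162, 243        # Flipper vs Hollywood, MHz
gc_mem, wii_mem = 24, 24 + 64     # usable working-set RAM in MB
                                  # (Gamecube ARAM excluded: too slow)

print(f"CPU clock:          {wii_cpu / gc_cpu:.2f}x")  # 1.50x
print(f"GPU clock:          {wii_gpu / gc_gpu:.2f}x")  # 1.50x
print(f"Working-set memory: {wii_mem / gc_mem:.2f}x")  # 3.67x
```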
 
Microsoft doubled their RAM from the initial plans at the behest of third-party developers, at a purported cost north of $1B. I wouldn't call them "cheap", overall. Poor build quality, but better components. More priority arguably should have been given to the build, but hindsight is 20/20. I don't think they'll repeat that error.
 

Drek

Member
It's also not true. There are a lot of instances where Nintendo went with the more expensive option. If anything, the original Xbox 360 was pure penny-pinching. The cheapest, shoddiest piece of hardware ever designed - because every single penny went into adding features and making the thing faster, and nobody at MS gave a shit that the system would probably fall apart if you did as much as sneeze in the same room.

There are a lot of times when the best value is the most expensive option, FYI. That doesn't mean it isn't still penny-pinching to do a cost:value analysis and choose the best ratio.

The PS3 is hardware designed against this philosophy, where Sony's desire for "value added" tech resulted in pushing Blu-ray when it wasn't truly required, an intricate Cell architecture that, while theoretically superior to its competitors, was a chore to work with, etc.

Nintendo, meanwhile, consistently chooses the best bang-for-the-buck components and plays it safe with regard to non-standard tech that doesn't pay immediate dividends. They're very good at hardware budgeting, but that doesn't mean they get to ignore the laws of mathematics and physics and produce a system that outperforms its theoretical limits.

This is why the Wii U will be clearly underpowered compared to its upcoming competition. Even if Nintendo achieved a zero-loss architecture where every game could run at maximum FLOPS, poly count, etc. (whatever theoretical throughput metric you fancy), that theoretical maximum is still going to be below the baseline expected from PS4/X720 hardware. That's just reality. They're systems a full generational leap ahead of the Wii U in hardware.

I do look forward to what the Wii U achieves from a technological standpoint, however, if only to see what Nintendo might have offered in '06 instead of the Wii, had they stuck with the traditional hardware battle.
 

wsippel

Banned
Microsoft doubled their RAM from the initial plans at the behest of third-party developers, at a purported cost north of $1B. Poor build quality, but better components.
Well, I'd also consider the drive a "component", and there's nothing good about the drives Microsoft used. Every single penny went into the CPU, GPU, RAM and licenses. A teardown from 2006 suggests that the Wii's peripheral components and manufacturing were several times as expensive as the 360's (though obviously not as expensive as the PS3's in that regard).


This is why the Wii U will be clearly underpowered compared to its upcoming competition. Even if Nintendo achieved a zero-loss architecture where every game could run at maximum FLOPS, poly count, etc. (whatever theoretical throughput metric you fancy), that theoretical maximum is still going to be below the baseline expected from PS4/X720 hardware. That's just reality.
Yes, both those systems will be a lot more powerful.

They're systems a full generational leap ahead of the Wii U in hardware.
No. Unless you consider the original Xbox a full generation ahead of the PS2, or the Dreamcast the same generation as the PS1.
 
No. Unless you consider the original Xbox a full generation ahead of the PS2, or the Dreamcast the same generation as the PS1.

Except there's no indication yet that the Wii U is more powerful than the PS3 and 360.

And now that we know the pathetic bandwidth, I wouldn't necessarily count on it.
 
This is why the Wii U will be clearly underpowered compared to its upcoming competition. Even if Nintendo achieved a zero-loss architecture where every game could run at maximum FLOPS, poly count, etc. (whatever theoretical throughput metric you fancy), that theoretical maximum is still going to be below the baseline expected from PS4/X720 hardware. That's just reality.

You say this as if anyone with a brain denies it. That is not what is being discussed in this thread. People are trying to understand what Nintendo's latest console is capable of.
 
Cost is always going to be a prevailing restriction of design unless Ken Kutaragi is designing a system.

But cost is clearly more of a binding constraint in Nintendo's design philosophy, considering they normally intend to sell hardware at a profit and, going by recent form, try to target a lower price.

And within that cost envelope, certain priorities come into play. As you say, Microsoft focused their efforts within that cost restriction on silicon and RAM - on performance - forgoing spending on things that would have improved build quality. With their new effort, it does seem like a major priority is to integrate the Windows 8 ecosystem into the Xbox system.

Nintendo's priorities with this system have been, as far as I can tell, to include a touch screen controller, be comparable to current systems, maintain backwards compatibility, produce a small form factor and have a tiny power draw - not necessarily in that order. All while being as inexpensive as possible.

I don't really know if there's anything to indicate that they care whether there are bottlenecks for developers porting games designed primarily for other systems. IMHO they don't care, or at the least they don't make it a priority in their design philosophy.
 

Drek

Member
You say this as if anyone with a brain denies it. That is not what is being discussed in this thread. People are trying to understand what Nintendo's latest console is capable of.

Really? Because I quoted someone who actually tried to make an "Xbox 360 was supposedly 8x less powerful than the PS3, yet they got the same games!" argument.

And that's the really out-there comment. The idea that 3rd parties will give the Wii U a second look, short of Nintendo money hats, once the PS4 and X720 are out is delusional.

Hell, the guy I responded to has already tried to argue that the gap between the Wii U and the next Sony/MS systems isn't a generational leap forward in hardware. See here:

No. Unless you consider the original Xbox a full generation ahead of the PS2, or the Dreamcast the same generation as the PS1.
Sorry, but this is just foolish. The CPU is still PowerPC architecture, while both the PS4 and X720 will have AMD's Jaguar. That is more than a few CPU architecture generations apart, as the PowerPC 750 was the base not just for the Wii's CPU but for the Gamecube's as well. There is only so much that amping up the horsepower will do for a chip architecture, and the PowerPC line is damned long in the tooth.

The RAM is 2 GB of DDR3-1600. At best, Microsoft goes with the same, just 4 times as much - a generational leap in quantity if nothing else, though we'd be foolish to presume they stick with DDR3-1600, especially given what Sony is doing: opting for 4 GB of GDDR5, which is double the Wii U's memory with massively superior chips. Sony, at least, is clearly taking a noteworthy generational leap forward here.
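For what it's worth, peak bandwidth is just transfer rate times bus width, so the gap is easy to put rough numbers on (the bus widths here are assumptions pulled from the rumors, not confirmed specs):

```python
# Peak bandwidth (GB/s) = transfer rate (MT/s) * bus width (bytes) / 1000.
# Bus widths below are assumptions based on the rumors, not confirmed specs.
def peak_gbs(mt_per_s, bus_bits):
    return mt_per_s * (bus_bits // 8) / 1000.0

wiiu = peak_gbs(1600, 64)    # DDR3-1600 on a 64-bit bus  -> 12.8 GB/s
ps4  = peak_gbs(5500, 256)   # GDDR5 @ 5.5 Gbps, 256-bit  -> 176.0 GB/s
print(wiiu, ps4, f"{ps4 / wiiu:.0f}x")   # roughly a 14x gap
```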

Most rumors I've seen say the Wii U's "Latte" GPU is based on Radeon 5000-series architecture, while the PS4 and X720 are both on at least 7000-series architecture. Another clear generational leap.

These are outright hardware-generation revisions by the hardware suppliers themselves, not some stupid "generation" classification applied to each console iteration. Saying the Wii U is the same "generation" of hardware would be like arguing that a late-2000s off-the-shelf desktop is the same "generation" of hardware as something of similar cost off the shelf in 2012 - i.e. not remotely the same.
 

andthebeatgoeson

Junior Member
Or a similar situation to the PS3/360. Wasn't the PS3 supposed to be like "8x" more powerful? Yet Uncharted 3 looks like GOW3. It's extremely presumptuous to think the unannounced Durango/Orbis are gonna be the dominant, awesome systems.
Especially when they still have to sell. Two consoles, both at $400?

Luckily, I have a 7970 in my laptop and Steam at my back.
 

wsippel

Banned
Sorry, but this is just foolish. The CPU is still PowerPC architecture, while both the PS4 and X720 will have AMD's Jaguar. That is more than a few CPU architecture generations apart, as the PowerPC 750 was the base not just for the Wii's CPU but for the Gamecube's as well. There is only so much that amping up the horsepower will do for a chip architecture, and the PowerPC line is damned long in the tooth.
Makes no sense. Jaguar is a 2012 design. Espresso is a 2011 design. The architecture is decades old in either case.

The RAM is 2 GB of DDR3-1600. At best, Microsoft goes with the same, just 4 times as much - a generational leap in quantity if nothing else, though we'd be foolish to presume they stick with DDR3-1600, especially given what Sony is doing: opting for 4 GB of GDDR5, which is double the Wii U's memory with massively superior chips. Sony, at least, is clearly taking a noteworthy generational leap forward here.
DDR3 is 2007 tech, GDDR5 is 2007 tech. Different performance characteristics for different applications, same generation. And more of the same isn't really a generational leap, either.

Most rumors I've seen say the Wii U's "Latte" GPU is based on Radeon 5000-series architecture, while the PS4 and X720 are both on at least 7000-series architecture. Another clear generational leap.
We have no idea how much R700 is left, though. And regardless of what it's based on, Latte was developed in 2011. As was - you guessed it - Southern Islands.
 
We have no idea how much R700 is left, though. And regardless of what it's based on, Latte was developed in 2011. As was - you guessed it - Southern Islands.

And ... ?

A low-end, low-power 2011 part is not automatically going to be faster than parts from a few years back.

The Wii U GPU is slower than mainstream AMD parts from 2008 (4870/90).
 
Edit: also, the PS4 is reserving 512MB for the OS... they can't expand it later on, so they must feel comfortable with that number, or else they would have reserved more and streamlined it later. An indication that Nintendo can trim their OS while remaining "next-gen"?
What the hell is Nintendo even doing with the reserved OS memory now? I assumed it would mean things like Miiverse would be in the background all the time, but given that it's far from instantaneous to switch to it, that doesn't seem to be the case.
 

wsippel

Banned
And ... ?

A low-end, low-power 2011 part is not automatically going to be faster than parts from a few years back.

The Wii U GPU is slower than mainstream AMD parts from 2008 (4870/90).
48xx was high end, 47xx was mainstream.

Anyway, who cares? Yes, newer doesn't necessarily mean faster; "more modern" doesn't equal "more powerful". That was part of the point I was trying to make. Southern Islands is several generations ahead of R700, yet even a mainstream 4770 from 2009 easily beats entry-level Southern Islands GPUs.
 
The God of War series has historically relied massively on pre-rendered backgrounds. It wasn't until God of War 3 on PS3 that the series saw much in the way of real-time background detail. It isn't a technical benchmark of anything, other than what very strong artistic design can do regardless of hardware.


What backgrounds were pre-rendered in God of War 1 and 2?

I've only played the PS3 collection, but the only pre-rendered content was some cutscenes.
 

chaosblade

Unconfirmed Member
What the hell is Nintendo even doing with the reserved OS memory now? I assumed it would mean things like Miiverse would be in the background all the time, but given that it's far from instantaneous to switch to it, that doesn't seem to be the case.

This is what I've wondered since I got the system. It baffles me, because there's nothing indicating that much of that 1GB is being used for anything at all. There's no excuse for the entirety of the system software not to be running directly out of RAM with as much as they have dedicated to it.

I just can't help but wonder if it was literally thrown together at the last minute and is still just extremely poorly put together. Maybe this year we'll get a "proper" version of the OS that will be snappy? Who knows; it just doesn't seem like there's any excuse for the way things are now from a hardware perspective.
 

ozfunghi

Member
And ... ?

A low-end, low-power 2011 part is not automatically going to be faster than parts from a few years back.

The Wii U GPU is slower than mainstream AMD parts from 2008 (4870/90).

lol

The 4890 was top dog, the highest-specced card AMD had that generation. I know because I own one. There was no HD49xx. In some cases you could get better performance going with two 4870s, but in no way, shape or form were these "mainstream" AMD parts.
 

Lonely1

Unconfirmed Member
48xx was high end, 47xx was mainstream.

Anyway, who cares? Yes, newer doesn't necessarily mean faster; "more modern" doesn't equal "more powerful". That was part of the point I was trying to make. Southern Islands is several generations ahead of R700, yet even a mainstream 4770 from 2009 easily beats entry-level Southern Islands GPUs.

Eh, the 47xx was hardly mainstream. It would have been hard to find a PC with something as good as a 46xx at your local Best Buy at the time.
 
Eh, the 47xx was hardly mainstream. It would have been hard to find a PC with something as good as a 46xx at your local Best Buy at the time.

The 4770 launched as a mainstream card in the first half of 2009 at a price of $100. At least that's what many reviewers called it (just Google a bit), and $100 is also what I'd call mainstream (as opposed to "entry level" or "low end", which are around $50).
 
So have we accepted the possibility that the next Xbox will have eSRAM or eDRAM, assuming the rumors are true about its main RAM being slower than Sony's chosen memory type? If so, why do some refuse to believe this is the case for the WiiU?

One question I have: is texture resolution bandwidth-restricted, or is it just down to the amount of memory available?

I'm wondering how much memory bandwidth is needed for texture resolution, besides bandwidth-intensive shaders.
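For the capacity side, the napkin math seems simple enough: storage scales with resolution times bits per pixel, and block compression divides it down. A rough sketch (DXT rates are the standard ones, and the ~1/3 mip-chain overhead is the usual rule of thumb):

```python
# Texture storage is mostly a capacity question:
# bytes = width * height * bits_per_pixel / 8, plus ~1/3 for the mip chain.
def texture_mb(width, height, bits_per_pixel, mips=True):
    base = width * height * bits_per_pixel / 8
    return base * (4 / 3 if mips else 1) / 2**20

for res in (512, 1024, 2048):
    rgba8 = texture_mb(res, res, 32)   # uncompressed RGBA8
    dxt5  = texture_mb(res, res, 8)    # DXT5 = 8 bits per pixel
    print(f"{res}x{res}: RGBA8 {rgba8:5.1f} MB, DXT5 {dxt5:4.1f} MB")
```

Bandwidth, as I understand it, then limits how fast the GPU can sample those textures each frame, not how big they can be.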
 

Spongebob

Banned
So have we accepted the possibility that the next Xbox will have eSRAM or eDRAM, assuming the rumors are true about its main RAM being slower than Sony's chosen memory type? If so, why do some refuse to believe this is the case for the WiiU?

One question I have: is texture resolution bandwidth-restricted, or is it just down to the amount of memory available?

I'm wondering how much memory bandwidth is needed for texture resolution, besides bandwidth-intensive shaders.
It's already been confirmed that it has eSRAM (32MB, most likely).
 
Sure, because MEM1 is no framebuffer.



The MEM1/MEM2 hierarchy is an obvious evolution of the main RAM/aux RAM concept. ARAM/MEM2 became much bigger, but also much faster, over the years compared to MEM1, but the idea is still to have very fast memory for highly performance-critical stuff and "mass storage" for everything else. Is 32MB enough? No idea, but MEM2 certainly doesn't need to be bigger than MEM1 with current workloads.
Good point. An example of this is that the eSRAM in Durango (which may actually be used the same way as the eDRAM in the Wii U) is also reported to be 32MB.
 

v1oz

Member
How did you come to that conclusion? The Wii U sounds more like a 2006 multicore Gamecube to me.
The GPU and the RAM amounts are a bit beyond what a "2006 Gamecube" would have been. Even the CPU has at least a few modern parts in it.
 
It's already been confirmed that it has eSRAM (32MB, most likely).


LOL, so my thoughts on shaders not needing a large amount of memory, just access to a lot of bandwidth, could be correct. This 32MB would hold shader data fetched from the disc or hard drive, which the GPU then processes after receiving it at crazy-high bandwidth speeds.
 

ozfunghi

Member
How did you come to that conclusion? The Wii U sounds more like a 2006 multicore Gamecube to me.

Yeah? With 2GB of RAM (no way Nintendo would have used 1GB for the OS in 2006) and a +/-500 GFLOPS GPU with a DX10.1+ feature set? Nah.

If we're going by year count, a 2009/2010 Gamecube perhaps.
 
The 4770 launched as a mainstream card in the first half of 2009 at a price of $100. At least that's what many reviewers called it (just Google a bit), and $100 is also what I'd call mainstream (as opposed to "entry level" or "low end", which are around $50).

Low-end PC GPUs: <$100
Mainstream: $100-200

Well, anyway:

Wii U GPU < 4770

4770: 640 SPs @ 750 MHz
7970M: 1280 SPs @ 850 MHz

So depending on clock speeds, the PS4 GPU will be 2-3 times more powerful.
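Using the usual peak-FLOPS rule of thumb (shader processors x 2 ops for a multiply-add x clock) with the numbers above:

```python
# Peak GFLOPS = shader processors * 2 (multiply-add) * clock in GHz.
def gflops(sp, mhz):
    return sp * 2 * mhz / 1000.0

hd4770  = gflops(640, 750)    # 960.0 GFLOPS
hd7970m = gflops(1280, 850)   # 2176.0 GFLOPS
print(hd4770, hd7970m, f"{hd7970m / hd4770:.1f}x")  # ~2.3x
```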
 

Schnozberry

Member
The difference between Orbis and Wii U is looking to be far in excess of that from PS2 to Xbox. Unless you think the Wii U can approach this performance, with visuals of this fidelity:

http://www.youtube.com/watch?feature=player_embedded&v=WWyXWSsg5LU
http://www.youtube.com/watch?v=4dw9jZ8CuE8&feature=player_embedded

You think Durango and Orbis will have the same computational power as a 7970 coupled with a high-end Intel CPU? Dude, you need to hook me up with some of your drugs.
 

ozfunghi

Member
Low-end PC GPUs: <$100
Mainstream: $100-200

You mean low-end & mainstream for gaming purposes, perhaps.

Mainstream would mean whatever the majority of PC users have in their PCs. And that is not, nor has it ever been, a $100-200 GPU. Mainstream would more likely be $40-70. If that.

We're back to a ~600 GFLOPS Wii U now? Sheesh...

I think the last serious discussion and speculation based on vague info was closer to 500 GFLOPS. Something like 460, which I have been guessing for the past 4 months.
 
Well, I think the WiiU's GPU will be more in the 200-250 GFLOPS ballpark. I mean, Xenos in the Xbox 360 is more or less 240 GFLOPS, and it can run the same games found on the WiiU at much better framerates, so the WiiU's GPU has to be inferior to that.
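(For reference, that 240 figure comes from Xenos' 48 vec4+scalar ALUs: 5 lanes each, 2 ops per multiply-add, at 500 MHz:)

```python
# Xenos peak: 48 ALUs * 5 lanes (vec4 + scalar) * 2 ops (MADD) * 0.5 GHz.
xenos_gflops = 48 * 5 * 2 * 0.5
print(xenos_gflops)  # 240.0 GFLOPS
```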

Regards!
 
Low-end PC GPUs: <$100
Mainstream: $100-200

Well, anyway:

Wii U GPU < 4770

4770: 640 SPs @ 750 MHz
7970M: 1280 SPs @ 850 MHz

So depending on clock speeds, the PS4 GPU will be 2-3 times more powerful.
Several rumors have consistently pegged the PS4 at 18 CUs @ 800 MHz. I believe that 1 CU = 64 SPs, so the PS4 would have 1152 SPs.
If we are comparing the GPUs of the Wii U and PS4, though, we have to remember that the PS4 is supposed to have some other hardware components helping with its graphics output.
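Plugging those rumored figures into the same peak-FLOPS rule of thumb used earlier in the thread:

```python
# Rumored PS4 GPU: 18 CUs * 64 SPs/CU = 1152 SPs at 800 MHz.
ps4_sp = 18 * 64                        # 1152 shader processors
ps4_gflops = ps4_sp * 2 * 800 / 1000    # * 2 ops (MADD) * clock in GHz
print(ps4_sp, ps4_gflops)               # 1152, 1843.2 GFLOPS
```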
 

ozfunghi

Member
Well, I think the WiiU's GPU will be more in the 200-250 GFLOPS ballpark. I mean, Xenos in the Xbox 360 is more or less 240 GFLOPS, and it can run the same games found on the WiiU at much better framerates, so the WiiU's GPU has to be inferior to that.

Regards!


I have a feeling you won't be around for long.

Games like Mass Effect 3 look as good as they do on the 360 because devs have been working on the platform for 7 years, have it as the lead platform for most multiplatform games, and know the hardware inside out. And yet the WiiU versions of such games - ports - do indeed run only "somewhat" worse, and in certain cases still better than on PS3, even though these are launch games. You can be sure WiiU games will look a whole lot better than these in the next couple of years. But ignorance is bliss, I suppose.
 

chaosblade

Unconfirmed Member
Well, I think the WiiU's GPU will be more in the 200-250 GFLOPS ballpark. I mean, Xenos in the Xbox 360 is more or less 240 GFLOPS, and it can run the same games found on the WiiU at much better framerates, so the WiiU's GPU has to be inferior to that.

Regards!

That's not really a fair comparison, since one is a brand new console getting rushed ports and the other is years old with mature tools and lots of dev experience.

Several rumors have consistently pegged the PS4 at 18 CUs @ 800 MHz. I believe that 1 CU = 64 SPs, so the PS4 would have 1152 SPs.
If we are comparing the GPUs of the Wii U and PS4, though, we have to remember that the PS4 is supposed to have some other hardware components helping with its graphics output.

That "compute module" would be for things like physics, which would help out with graphics just because the GPU wouldn't have to be used for tasks offloaded from the CPU.
 
Well, I think the WiiU's GPU will be more in the 200-250 GFLOPS ballpark. I mean, Xenos in the Xbox 360 is more or less 240 GFLOPS, and it can run the same games found on the WiiU at much better framerates, so the WiiU's GPU has to be inferior to that.

Regards!
Numerous sources have reported that the Wii U has a superior GPU. Your deduction is flawed because you're comparing launch-title ports developed by small teams to 7th-gen 360 games.
 
That's not really a fair comparison, since one is a brand new console getting rushed ports and the other is years old with mature tools and lots of dev experience.



That "compute module" would be for things like physics, which would help out with graphics just because the GPU wouldn't have to be used for tasks offloaded from the CPU.
Perhaps that was for Durango. I will have to reread that Eurogamer article.
 
I have a feeling you won't be around for long.

Games like Mass Effect 3 look as good as they do on the 360 because devs have been working on the platform for 7 years, have it as the lead platform for most multiplatform games, and know the hardware inside out. And yet the WiiU versions of such games - ports - do indeed run only "somewhat" worse, and in certain cases still better than on PS3, even though these are launch games. You can be sure WiiU games will look a whole lot better than these in the next couple of years. But ignorance is bliss, I suppose.
Well, I hope so, because I have a WiiU and I'm enjoying it a lot. If it turns out to be better than current gen, then good for me.

About being around for long... what do you mean? Can I get banned from this site just because I'm a bit sceptical about the WiiU's capabilities? O_O
 

ozfunghi

Member
Well, I hope so, because I have a WiiU and I'm enjoying it a lot. If it turns out to be better than current gen, then good for me.

About being around for long... what do you mean? Can I get banned from this site just because I'm a bit sceptical about the WiiU's capabilities? O_O

Not for being sceptical.
 