
Rumor: Wii U final specs

RIGHT, which is why I have set aside 512MB of RAM, and said that it's even possible to get more out of that pool if it's optimized further. Personally I do not know how much RAM Netflix on 360 requires, but obviously it can't be more than 512MB; if it's less, and that is the most RAM-intensive thing the Wii U does for instance, then that 512MB pool can be tapped into. (Also saying that it's quite reasonable to assume that Wii U could tap into 1.5GB of RAM atm if the app software was on par with 360's.)

But Wii U will do multitasking, aka running more than one app in RAM. The OS and system services will need some RAM (less than 1GB), and the rest is for apps (like Netflix, the web browser and so on).

This is that thing about how a 360 game used an entire core for sound, isn't it?

A thread; each core (of three) has 2 threads.
 

z0m3le

Banned
As it's been pointed out, it's too early to say how they really compare. However it is safe to say it will be easier to port from the ps4/720 to the Wii-U than it was from the ps360 to the Wii.

With the PS360 we had multi-core and multi-threaded CPUs, GPUs with programmable shaders, and 512MB of total memory, while the Wii had a single core/thread CPU, a fixed-function GPU, and 88MB of memory. So you can imagine how difficult it would have been to port an engine designed to support multiple threads and programmable shaders over to the Wii, which featured neither.

Now with the next gen, all 3 systems will support multiple threads and have programmable shaders to play around with. So no matter the gap between the ps4/720 and Wii-U, it will still be easier than what we saw this gen.



We really can't say this for sure.

PS2 was a 6GFLOPs console and Xbox was a 21GFLOPs console; that difference is proportionally greater than what we are hearing for Wii U to PS4, which is ~600GFLOPs to 1843GFLOPs. So while we can't say anything for sure (we still don't even know Wii U's performance atm), it's about the only thing we can talk about, since we have to take these rumors as the base for our speculation.
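To put rough numbers on that ratio argument (back-of-envelope math on the figures quoted above, which are themselves rumors):

# Ratios between the quoted peak-FLOPS figures (Wii U / PS4 numbers are unconfirmed)
ps2, xbox = 6.0, 21.0        # GFLOPS, last gen
wiiu, ps4 = 600.0, 1843.0    # GFLOPS, rumored
print(f"Xbox / PS2  = {xbox / ps2:.1f}x")    # ~3.5x
print(f"PS4  / WiiU = {ps4 / wiiu:.1f}x")    # ~3.1x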
 

Zabant

Member
With specs like these I can now fully understand why all their marketing and former press conferences were "HEY CHECK OUT OUR CONTROLLER"
 

v1oz

Member
This is that thing about how a 360 game used an entire core for sound isn't it?

I highly doubt most games do, if even many...

But someone more in the know can correct me.
Indeed. That would be a poor use of resources; a single hardware thread makes sense, but an entire core dedicated to sound is wasteful, especially given that a single core has much more processing power than the GameCube or Xbox CPU.
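For a rough sense of the fractions being thrown around here (assuming the commonly cited Xenon layout of 3 cores with 2 hardware threads each; illustrative only):

# Xenon (360 CPU): 3 cores x 2 hardware threads = 6 hardware threads total
cores, threads_per_core = 3, 2
hw_threads = cores * threads_per_core
print(f"one hardware thread ~ {1 / hw_threads:.0%} of the CPU")  # ~17%, i.e. "a sixth"
print(f"one full core       ~ {1 / cores:.0%} of the CPU")       # ~33%, i.e. "a third"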
 

Quaz51

Member
PS2 was a 6GFLOPs console and Xbox was a 21GFLOPs console; that difference is proportionally greater than what we are hearing for Wii U to PS4, which is ~600GFLOPs to 1843GFLOPs. So while we can't say anything for sure (we still don't even know Wii U's performance atm), it's about the only thing we can talk about, since we have to take these rumors as the base for our speculation.

6GFLOPs is just the PS2's CPU; it's a very bad comparison.
This type of comparison isn't possible for the old gen.
 

v1oz

Member
But Wii U will do multitasking, aka running more than one app in RAM. The OS and system services will need some RAM (less than 1GB), and the rest is for apps (like Netflix, the web browser and so on).



A thread; each core (of three) has 2 threads.
Really... Has it been confirmed that you can multitask whilst running games? E.g. pausing a game to check something on the web?

If any console ever needed to have upgradable RAM, it's the Wii U. That worked really well with the N64; you could double the on-board RAM with relative ease. No one has ever tried that idea again since.
 
Really... Has it been confirmed that you can multitask whilst running games? E.g. pausing a game to check something on the web?

If any console ever needed to have upgradable RAM, it's the Wii U. That worked really well with the N64; you could double the on-board RAM with relative ease. No one has ever tried that idea again since.
It would be cool if someone making a cheap, underpowered machine like this one supported multi-GPU rendering by connecting 2 consoles.

Since they maybe don't lose money on each machine sold, the feature could be sustained; the only hurdle I see is launch timing because of hardware availability.
 

z0m3le

Banned
6GFLOPs is just the PS2's CPU; it's a very bad comparison.
This type of comparison isn't possible for the old gen.

They have more similar architectures than PS2 and Xbox did, and RAM was 32MB vs 64MB as well. I'm not sure how you could compare these consoles back then. How exactly would you compare them, beyond, you know, one being Nintendo and the other Sony?
 
PS2 was a 6GFLOPs console and Xbox was a 21GFLOPs console; that difference is proportionally greater than what we are hearing for Wii U to PS4, which is ~600GFLOPs to 1843GFLOPs. So while we can't say anything for sure (we still don't even know Wii U's performance atm), it's about the only thing we can talk about, since we have to take these rumors as the base for our speculation.

More FLOPS, yes, but with a contemporary architecture. It is like comparing 2 DX10 GPUs, low end and high end: more resolution, better framerate, some extra polygons, some better textures, but the "same" architecture.

It will depend on whether MS and Sony go with a higher-level architecture than Wii U. If PC, Xbox 720 and PS4 can make a bigger jump "together" (DX11++/DX12), those 1843 GFLOPS will not mean anything.

If Xbox 720 and PS4 are "current DX11" (current PC level), I guess Wii U will get most ports.
 

z0m3le

Banned
More FLOPS, yes, but with a contemporary architecture. It is like comparing 2 DX10 GPUs, low end and high end: more resolution, better framerate, some extra polygons, some better textures, but the "same" architecture.

It will depend on whether MS and Sony go with a higher-level architecture than Wii U. If PC, Xbox 720 and PS4 can make a bigger jump "together" (DX11++/DX12), those 1843 GFLOPS will not mean anything.

If Xbox 720 and PS4 are "current DX11" (current PC level), I guess Wii U will get most ports.

Xbox used "DX8" while PS2 was more in line with DX7. So I'm not sure what you are talking about...
 

v1oz

Member
It would be cool if someone making a cheap, underpowered machine like this one supported multi-GPU rendering by connecting 2 consoles.

Since they maybe don't lose money on each machine sold, the feature could be sustained; the only hurdle I see is launch timing because of hardware availability.
How'd you hook them up? Using Ethernet cables or something?
 
With specs like these I can now fully understand why all their marketing and former press conferences were "HEY CHECK OUT OUR CONTROLLER"

The butthurt, man, I've never seen so much in one place at one time. I'm new to NeoGAFii, but has anyone ever seen it like this, or maybe even worse?
 

Quaz51

Member
maybe GPU flops is like:
X360 > x2 > WiiU > x3 > PS4
but ROP (fillrate) is more like:
X360 > x1.2 > WiiU > x5.3 > PS4
or CPU (assumption)
X360 > x0.8 > WiiU > x4 > PS4
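Multiplying those guessed chains out against commonly cited Xbox 360 baselines (roughly 240 GFLOPS GPU, 4 Gpixels/s fillrate, ~115 GFLOPS CPU; purely illustrative, nothing here is a confirmed spec):

# Apply the guessed multipliers to commonly cited X360 baseline figures
x360 = {"gpu_gflops": 240.0, "fillrate_gpix_s": 4.0, "cpu_gflops": 115.0}
wiiu = {"gpu_gflops": x360["gpu_gflops"] * 2,              # ~480
        "fillrate_gpix_s": x360["fillrate_gpix_s"] * 1.2,  # ~4.8
        "cpu_gflops": x360["cpu_gflops"] * 0.8}            # ~92
ps4  = {"gpu_gflops": wiiu["gpu_gflops"] * 3,              # ~1440
        "fillrate_gpix_s": wiiu["fillrate_gpix_s"] * 5.3,  # ~25.4
        "cpu_gflops": wiiu["cpu_gflops"] * 4}              # ~368
for name, spec in (("Wii U", wiiu), ("PS4", ps4)):
    print(name, spec)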
 
How'd you hook them up? Using Ethernet cables or something?
That would be the least of the problems. But I was thinking, since the hypothetical unit would be small (and the target is a power user), a sort of really nice-looking rack with some fans for extra cooling, sold with a good profit margin. Just look how nice some of the multi-HDD enclosures look. Or the cheap way, with a cable, like AMD CrossFire.

This could be really cool, because if the adoption rate was high enough, maybe some years down the line they could release the multi-GPU system in the same box, as a sort of relaunch to keep the brand fresh.
 

z0m3le

Banned
maybe GPU flops is like:
X360 > x2 > WiiU > x3 > PS4
but ROP (fillrate) is more like:
X360 > x1.2 > WiiU > x5.3 > PS4
or CPU (assumption)
X360 > x0.8 > WiiU > x4 > PS4

Right, and shots in the complete dark are better than the rumors we have? We can speculate on a lot of things, but saying that the Samsung game console that comes out in 2014 will make all 3 obsolete is pointless.
 
One thing for certain is that it's going to be easy to compare GPUs for all 3 systems.
Yeah, each one is going to get some tweaks, but they're all made by AMD.
 

TheD

The Detective
Apparently most 360 games use a sixth of the CPU power for sound and many use a third

By "many use a third" you mean one, a game I might add that had over 200 sound sources playing at the same time.

You don't know what you are talking about, sorry.

DX10.1? antonz has already pointed out that's incorrect. SM4? Wii U has already exceeded those specs, and they are just a basic baseline for devs to know is there. For instance, GPGPU support was pretty much introduced with DX11, at least unified shader support for it.

NO!

SM4.1 is not the same as SM5!
All GPUs have things that exceed the DX Shader Model they support!
That does not mean they support a higher SM level.

GPGPU has been done for years!
OpenCL is supported by DX10 cards.
DX compute shaders work on DX10 hardware.
Folding@Home has even used GPU acceleration on old ATI X1900s!
The 360 supports using the GPU for GPGPU via MEMEXPORT.
etc.
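As an illustrative aside (not from the post above): GPGPU on that class of hardware is reachable through completely generic APIs. A minimal OpenCL vector-add via pyopencl, which will run on any OpenCL-capable GPU with suitable drivers, looks like this:

import numpy as np
import pyopencl as cl

# Tiny kernel: c[i] = a[i] + b[i], compiled and run on whatever OpenCL device is found
src = """
__kernel void vadd(__global const float *a, __global const float *b, __global float *c) {
    int i = get_global_id(0);
    c[i] = a[i] + b[i];
}
"""
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
c_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)
prog = cl.Program(ctx, src).build()
prog.vadd(queue, a.shape, None, a_buf, b_buf, c_buf)
out = np.empty_like(a)
cl.enqueue_copy(queue, out, c_buf)
assert np.allclose(out, a + b)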
 

DonMigs85

Member
Not sure how you can compare Xbox and PS2 to begin with, since the PS2 also used both of its chips for GFX. That 6 GFLOPS you're talking about is for the CPU alone.

The Reality Synthesizer was more of a graphics accelerator and couldn't do floating point on its own. Still a big jump from Dreamcast which had a theoretical max of 1.4 GFLOPs.
GameCube was also sandwiched right between PS2 and Xbox in this aspect (10.5 GFLOPs, 1.9 of which is from the Gekko CPU).
 

TheD

The Detective
The Reality Synthesizer was more of a graphics accelerator and couldn't do floating point on its own. Still a big jump from Dreamcast which had a theoretical max of 1.4 GFLOPs.
GameCube was also sandwiched right between PS2 and Xbox in this aspect (10.5 GFLOPs, 1.9 of which is from the Gekko CPU).

The Reality Synthesizer is the PS3's GPU.

You are thinking of the "Graphics Synthesizer".
 

z0m3le

Banned
By "many use a third" you mean one, a game I might add that had over 200 sound sources playing at the same time.



NO!

SM4.1 is not the same as SM5!
All GPUs have things that exceed the DX Shader Model they support!
That does not mean they support a higher SM level.

GPGPU has been done for years!
OpenCL is supported by DX10 cards.
DX compute shaders work on DX10 hardware.
Folding@Home has even used GPU acceleration on old ATI X1900s!
The 360 supports using the GPU for GPGPU via MEMEXPORT.
etc.

Read that post more slowly. YES, GPU compute is nothing new, and GPGPU functionality can be found in older cards, but compute on unified shaders was done in DX11. Tessellation was another thing moved over to unified shaders; both of these pretty much cover the change from DX10.1 to DX11, of which the Wii U uses neither.

And I didn't say it was SM5, I said it exceeded Shader Model 4 (via antonz); you are the one jumping the gun and declaring what that means.
 
2 gigs of RAM, huh? I'll have my crow medium rare please.

I still believe the reports that it's clocked low. I'm going with a 1440 MHz processor, 480 MHz GPU, and 720 MHz DDR3 configuration.

The big question is that eDRAM. The console seems to rely heavily on it. What's the bandwidth to the CPU and GPU? What kind of latencies are we talking? Below 2ns?
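For context on why that memory question matters, here's the kind of back-of-envelope bandwidth math those guesses imply (the 64-bit bus width is an assumption for illustration only; none of these figures are confirmed):

# Peak theoretical bandwidth = effective transfer rate x bus width
ddr3_clock_mhz = 720     # guessed above; DDR means 2 transfers per clock
bus_width_bits = 64      # assumed width, purely illustrative
bw_gb_s = ddr3_clock_mhz * 2 * (bus_width_bits / 8) / 1000
print(f"main memory: ~{bw_gb_s:.1f} GB/s")  # ~11.5 GB/s on these assumptions
# Any eDRAM would have to make up for that with much higher on-die bandwidth.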
 
Kudos to Ideamon mon mon and bg!

Thanks. I'd like to thank all the messageboard posts I've read this past year. I wouldn't be here without you.

Makes you wonder if Iwata (or his team of Ninjas) reads these forums, given that he just addressed a few things that the GAF collective have been bitching about for months now.

LOL. Same thing I was thinking. Those were rather specific mentions.

Not me. From the get go I surmised 2GB, and no one believed me :) it seemed to make sense. Anyway, all good news and let's hope they unlock more down the road when the other consoles come out. Maybe then they will unlock the second GPU....

Yep you were. At the time it didn't dawn on me that the info I was seeing back then referred to 1 or 1.5GB being for games, not total.

Hopefully sooner rather than later. This is good news for me overall. It's better than having only 1.5GB and nearly no room for updates that partition more RAM for the games.

I agree on both counts. This gives them future flexibility, but hopefully not the distant future.
 

z0m3le

Banned
PS2 is not "DirectX". PS2 don't have "shaders" as Xbox, but it can be programmable for things like selfshadows or motion blur.

http://teamico.wikia.com/wiki/Graphics_and_post-processing_effects_used_in_Shadow_of_the_Colossus

Xbox used "DX8" while PS2 was more in line with DX7. So I'm not sure what you are talking about...

What I meant was that PS2 effects were in line with what we saw from DX7. Compare PS2 games to Xbox games and you will see effects missing on a constant basis; sometimes it's because the PS2 wasn't powerful enough for them, and other things were simply only possible thanks to the nearly 2 extra years that Xbox spent in development.
 
PS2 is not "DirectX". PS2 don't have "shaders" as Xbox, but it can be programmable for things like selfshadows or motion blur.

http://teamico.wikia.com/wiki/Graphics_and_post-processing_effects_used_in_Shadow_of_the_Colossus
Equating the PS2 architecture with anything DX is insane in the first place.

If you wanted pixel shading beyond simple emboss or glossy maps, it was almost purely software-oriented. But because of that, versatility was also much higher than with the DX8+ GPU of the Xbox. Depending on the ops performed, though, you really could cap your poly count.
 

nordique

Member
maybe GPU flops is like:
X360 > x2 > WiiU > x3 > PS4
but ROP (fillrate) is more like:
X360 > x1.2 > WiiU > x5.3 > PS4
or CPU (assumption)
X360 > x0.8 > WiiU > x4 > PS4

possibly

but just to point out, we've heard from some people like wsippel that, at the bare minimum, the Wii U CPU is "at least" as capable as the 360 CPU

It is getting a lot of criticism because it has a purportedly lower clock than the 360, but by all accounts it is still quite capable and efficient
 
The butthurt, man, I've never seen so much in one place at one time. I'm new to NeoGAFii, but has anyone ever seen it like this, or maybe even worse?

What about his post was "butthurt"? The specs really are that bad. He's probably right about that being the reason they've put so much focus on the controller as it's the centerpiece of the system.
 
What about his post was "butthurt"? The specs really are that bad. He's probably right about that being the reason they've put so much focus on the controller as it's the centerpiece of the system.
"Bad" is not quantifiable.

Disappointing, sure. But you could call Orbis's specs bad in comparison to their potential.
 
Xbox used "DX8" while PS2 was more in line with DX7. So I'm not sure what you are talking about...

What I meant was that PS2 effects were in line with what we saw from DX7. Compare PS2 games to Xbox games and you will see effects missing on a constant basis; sometimes it's because the PS2 wasn't powerful enough for them, and other things were simply only possible thanks to the nearly 2 extra years that Xbox spent in development.

No, you can't say PS2 was "DX7-like". Dreamcast was DX7-like, but PS2 can do a lot of things that DX7 can't, and it can do some things that Xbox can, even without a shader language.
 

Hcoregamer00

The 'H' stands for hentai.
PS2 is not "DirectX". PS2 don't have "shaders" as Xbox, but it can be programmable for things like selfshadows or motion blur.

http://teamico.wikia.com/wiki/Graphics_and_post-processing_effects_used_in_Shadow_of_the_Colossus

Heck, the Wii U is the first Nintendo console to have a fully programmable GPU. Even the 3DS doesn't have that. That alone should mean that ultimately the console will not be locked out of 3rd-party multiplat games like last generation.
 

Donnie

Member
It will easily happen, the Wii U is not that powerful.

It has double the RAM (available to games) of a 7-year-old machine; that's low.

A tri-core CPU that is not much (if at all) faster than a 7-year-old machine; that's not so good.

and a GPU based on a chip from 5 years ago (with a few bells and whistles), not great for a "next gen" machine.


Wii U is overpriced for the tech, tech that is not that impressive when compared to a 7 year old machine that cost $299 (arcade) at launch.


Believe it when people say that the next MS and Sony machines will be vastly better in terms of technology than the Wii U.

Nintendo have done a 3DS, cheap hardware that is overpriced.

You really don't know what Wii U's CPU is. PS3 had 392MB of RAM for games on release (can't remember the 360's). Oh, and finally, R700, which the GPU is based on, was released 4 years ago, not 5. At least get the facts straight if you're going to post this kind of thing.

I'll also remind you that plenty of new hardware is based on older hardware, most of it in fact. There's no reason for Nintendo to advance the GPU to an exact DX11 specification, since a lot of that may not suit how they want the GPU to work; they won't even be using DX. Instead they've taken a DX10.1 GPU as a starting point and added the features they want; it's a custom GPU now. For instance, if you think any GPU from 4 years back had 32MB of super-fast embedded RAM, then I'd love for you to point it out.
 

z0m3le

Banned
No, you can't say PS2 was "DX7-like". Dreamcast was DX7-like, but PS2 can do a lot of things that DX7 can't, and it can do some things that Xbox can, even without a shader language.

Right, that's what the second paragraph was describing: PS2 wasn't capable of a lot of effects that Xbox was. The entire point of what I was saying is that this is exactly like the situation people are painting for Wii U and PS4/XB3, except DX10.1 can do virtually everything DX11 can.
 

Donnie

Member
No, you can't say PS2 was "DX7-like". Dreamcast was DX7-like, but PS2 can do a lot of things that DX7 can't, and it can do some things that Xbox can, even without a shader language.

He's referring to the GPU, though. GS was really very basic as far as its feature set goes; so basic, in fact, that it's not even really accurate to call it a GPU, since it couldn't perform hardware T&L. That said, it was created to complement the EE (CPU), so HW T&L wasn't a big necessity given that the EE was very good at geometry processing. But still, GS lacked a lot of features beyond DX6. Where it did do well was brute-force work: massive bandwidth to the internal framebuffer and a quite insane pixel fillrate for the time allowed some nice effects for developers that were prepared to really put the work in. But feature-set-wise for GS, I'd say "DX7-like" is being very generous.
 

Donnie

Member
What about his post was "butthurt"? The specs really are that bad. He's probably right about that being the reason they've put so much focus on the controller as it's the centerpiece of the system.

Are the specs you're referring to the ones you've imagined? Because "enhanced Broadway cores" isn't a spec. "Based on a certain GPU" isn't a spec.
 

Donnie

Member
Dat OS footprint though. It's crazy to think Nintendo felt they needed half the memory just for the OS.

I assume some of that is to make sure the OS has plenty of memory to begin with (before they trim it down as they decide on its final features) and the rest is for apps. PS3 used about 25% of its memory for the OS on release. Now it uses something like 8%. So hopefully a good portion of that 1GB ends up being freed up for games.
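For a rough sense of what that trajectory would mean here (illustrative arithmetic only; the 25% and 8% PS3 figures are from the post above, and applying the same shrink factor to Wii U is pure speculation):

# If the Wii U OS reservation shrank the way the PS3's reportedly did (25% -> 8%)
total_mb, os_now_mb = 2048, 1024
os_later_mb = os_now_mb * (8 / 25)   # scale the reservation down by the same factor
print(f"OS could drop to ~{os_later_mb:.0f}MB, freeing ~{os_now_mb - os_later_mb:.0f}MB for games")
# ~328MB for the OS, ~696MB freed, on these assumptions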
 
Very.

And thanks bg for stayin' cool and calm and doing the detective work on all this Wii U stuff. Gave us many hours of reading and speculation.

I enjoyed doing it. Trying to learn what the console could be and could do while waiting on Nintendo to give us (some) real info really helped pass the time. Hard to believe over a year has passed. Seems so quick. I think today has shown it was worth the wait.

Exceeding DX10.1 should be expected, similar to how the PS360 exceeded the DX9 spec.

PS4 and 720 should exceed the DX11.1 spec too.

Agree on all counts. PS4 target specs have already indicated this.
 