
RUMOUR: Wii U specs leaked?


charsace

Member
Way better than I thought, based on all the "slightly more powerful than 360/PS3" talk. This is more than slightly. The next PlayStation/Xbox will have even better specs than this.
 

Raistlin

Post Count: 9999
I'm confused. I thought the CPU was supposed to have a 'large amount' of eDRAM (essentially L3 cache) for inter-core communication?

Now it's the GPU with eDRAM a la Xenos (though thankfully more)?

USB2? Ugh.
To what end, exactly? You're not going to be copying large files to and from the console regularly. Assuming the Wii U can even play back media files, USB 2.0 is more than enough for real-time buffering from a thumb drive or external HDD.
 

Orayn

Member
Seeing as I know nothing about console development, is there a reason they use so little RAM?

Or is my bloated OS the reason my PC uses so much?

That's part of it. The RAM they use is also more akin to what's in a video card, so they don't need as much of it.
 
Well, ZombiU is the only thing that looked better than current gen, but it did look a lot like CG, so I'm not sure. Definitely not impressed by the graphics of the games shown, which makes me wary of the GPU specs not revealed on this spec sheet.
 

Raistlin

Post Count: 9999
We are, in all likelihood, talking about memory that's much faster and more expensive than your phone's.
I think you're missing his point lol


demigod was confused by the seemingly strange figure of 1.5GB of RAM. The reality is that the individual RAM chips are going to be either 256MB or 512MB depending on the type of RAM chosen ... which allows such 'weird' totals to exist (they're rounding in the typical 1kB = 1024B fashion).

Magic Ovaries was simply pointing out an example of a product with this sort of strange RAM amount.
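To spell the arithmetic out, a quick sketch (the chip counts are my own illustration, not something from the leaked sheet):

# Illustrative only: how 1.5GB falls out of standard chip densities.
# The chip counts are assumptions, not from the leak.
for chips, size_mb in [(3, 512), (6, 256)]:
    print(f"{chips} x {size_mb}MB = {chips * size_mb / 1024}GB")
# 3 x 512MB = 1.5GB
# 6 x 256MB = 1.5GB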
 

v1oz

Member
Well, ZombiU is the only thing that looked better than current gen, but it did look a lot like CG, so I'm not sure. Definitely not impressed by the graphics of the games shown, which makes me wary of the GPU specs not revealed on this spec sheet.

Was it better than current gen? I can't tell. It definitely didn't "wow" me at all though.


...
 

Bombadil

Banned
1.5 gigs of RAM?

Not bad, but I guess Nintendo is banking on MS and Sony doing likewise.

I'm not too impressed.

I'm impressed by the efficiency, but not by the overall power.
 

aeroslash

Member
So after seeing the press conference, do you think these specs are real?

I sure haven't seen anything clearly above PS360.

Only ZombiU seemed nice, but I'm afraid Ubi has made a "Red Steel" again here... so I'm not commenting until I've watched some gameplay.
 

Raistlin

Post Count: 9999
What do you mean, like a sign that there's too little general bandwidth? I don't think it probably matters that much in the end. 32MB is enough for a 1080p buffer, and with some post-process AA that should yield excellent picture quality.
Whether you can easily do post-processing like that depends on whether it can actually use main RAM as a frame buffer and output from it, a la PS3. That actually is not the case for the 360, which is one of that console's 'problems'. Even if it can, though, how useful it is depends on the bandwidth/speed of the main RAM.

So if it can't use main RAM, or it's deemed too slow for practical purposes, it will be similar to the 360 situation where you need to do more than one eDRAM pass. Obviously that complicates development and has performance considerations ... which is why some devs simply don't do it.


I think the reality, though, is that most games will target 720p, which means this is actually plenty of eDRAM for effects and the like. Granted, for output to the Wii U screen there will be some memory being used ... but I'd imagine that would have little to no filtering/processing and is of a limited resolution. It shouldn't really take up all that much space.
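Some rough buffer-size numbers behind that, as a sketch (4 bytes per pixel assumed; the roughly 854x480 GamePad resolution is also an assumption here):

# Rough per-buffer cost, assuming 4 bytes/pixel (RGBA8 or a 32-bit depth buffer).
def buffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

for name, (w, h) in {"1080p": (1920, 1080), "720p": (1280, 720), "GamePad": (854, 480)}.items():
    print(f"{name}: {buffer_mib(w, h):.2f} MiB per 32-bit buffer")
# 1080p:   ~7.91 MiB
# 720p:    ~3.52 MiB
# GamePad: ~1.56 MiB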
 

charsace

Member
Whether you can easily do post-processing like that depends on whether it can actually use main RAM as a frame buffer and output from it, a la PS3. That actually is not the case for the 360, which is one of that console's 'problems'. Even if it can, though, how useful it is depends on the bandwidth/speed of the main RAM.

So if it can't use main RAM, or it's deemed too slow for practical purposes, it will be similar to the 360 situation where you need to do more than one eDRAM pass. Obviously that complicates development and has performance considerations ... which is why some devs simply don't do it.


I think the reality, though, is that most games will target 720p, which means this is actually plenty of eDRAM for effects and the like. Granted, for output to the Wii U screen there will be some memory being used ... but I'd imagine that would have little to no filtering/processing and is of a limited resolution. It shouldn't really take up all that much space.

360 eDRAM wasn't designed with deferred rendering in mind.

I'm not good at these things, but I think you would need 24MB of eDRAM to do deferred rendering at 1080p without AA.
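For what it's worth, that figure lines up with a minimal three-target G-buffer at 1080p; the exact layout below is just an assumption for illustration:

# Where a ~24MB figure could come from: three 32-bit render targets at 1080p.
# The three-target split (e.g. albedo, normals, depth) is an illustrative assumption.
width, height, bytes_per_pixel, targets = 1920, 1080, 4, 3
print(targets * width * height * bytes_per_pixel / (1024 * 1024))  # ~23.73 MiB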
 

Raistlin

Post Count: 9999
360 eDRAM wasn't designed with deferred rendering in mind.

I'm not good at these things, but I think you would need 24MB of eDRAM to do deferred rendering at 1080p without AA.
It isn't an issue that only occurs with deferred rendering though. There just isn't enough room for multiple buffers to do certain types of effects. This is why Unreal Engine games didn't have AA, etc.?
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
You could definitely see the power in Nintendo's E3 conference. Really. It was spectacular.
 

charsace

Member
It isn't an issue that only occurs with deferred rendering though. There just isn't enough room for multiple buffers to do certain types of effects. This is why Unreal Engine games didn't have AA, etc.?

You're right. Then HDR might not be possible at 1080p internal res.
 
I don't see why specs matter.

I mean, not even Nintendo is putting any effort into it.

Depending on the demographic, they don't matter. I am on a tight budget for one home and one portable console, so I will get the one with the best third-party AAA support. This generation it was a PS3 - it could easily have been a 360, but not a Wii (I do own one hehe).

Next generation is not yet decided, but if Sony and MS go overboard with power, that leaves the Wii U out of the race for me. In that case specs matter, because I want AC5 to 7 and GTA 6-8, not GTA: Wii U. I like Nintendo's approach and it certainly will work, but just not with me.
 

Durante

Member
Those specs seem entirely believable, and in line with what I read elsewhere previously.

However, they are still too incomplete to allow any decently accurate performance estimation.
 

muu

Member
And the audio part still sucks, although we'd have to see what can be done through analog. This could be the first 5.1 device I wouldn't be able to use to get 5.1 on my receiver. Which is rich, given Nintendo's multichannel support so far.

Xbox threw away TOSLINK support for their later revisions for cost-cutting purposes as well. Doesn't seem completely out of line, considering newer receivers fully support whatever version of HDMI they're on now.
 

AlStrong

Member
360 eDRAM wasn't designed with deferred rendering in mind.

That's a weird thing to say, since resolution is arbitrary and so is a G-buffer setup. ;) The 360 can do deferred rendering/shading/lighting just fine. If it weren't meant for more than 10MB of render targets, they wouldn't have implemented tiling in the first place, so your statement is wrong. It's just that most devs would rather not have to deal with the extra performance costs.


I'm not good at these things, but I think you would need 24mb of eDRAM to do deferred rendering at 1080p without AA.

Each 32-bit (e.g. RGBA8 or depth) texture @ 1080p is ~8.3MB/7.91MiB. Currently the fattest G-buffer renderer uses 4 MRTs plus a 32-bit depth/stencil, so that'd be ~41.5MB/39.55MiB for the G-buffer pass. Of course, there are various mini G-buffer approaches depending on what the devs choose to store in the various texture channels.

Double the space for FP16 or multiply by NxMSAA.
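A quick sketch that reproduces those numbers (just the arithmetic above, nothing more):

# Per-target cost at 1080p, and a fat G-buffer of 4 MRTs plus a 32-bit depth/stencil.
width, height = 1920, 1080
per_target = width * height * 4                   # bytes for one 32-bit target
print(per_target / 1e6, per_target / 2**20)       # ~8.29 MB / ~7.91 MiB
gbuffer = 5 * per_target                          # 4 MRTs + depth/stencil
print(gbuffer / 1e6, gbuffer / 2**20)             # ~41.5 MB / ~39.55 MiB
# FP16 targets double the per-target cost; NxMSAA multiplies it by N.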

It isn't an issue that only occurs with deferred rendering though. There just isn't enough room for multiple buffers to do certain types of effects. This is why Unreal Engine games didn't have AA, etc.?

Plenty of UE3 games used 2xMSAA, but as a forward renderer, they could choose when to enable or disable it for certain render passes (bloom, HDR lighting, transparencies). Proper MSAA shading/lighting simply costs more to do, so devs generally skipped it.
 

Popstar

Member
Well, you probably want to do your rendering at higher precision rather than just expanding INT8 to FP16...

Well, it's going to depend on your exact method. There's no reason you can't do HDR with RGBA8 if you do your tone mapping in the shader.

(1920*1080*32bit = ~7.91MiB btw, so they can just squeeze 4 buffers into 32MB)
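A toy illustration of that tone-mapping point, using the simple Reinhard curve (just one possible operator, picked here as an example):

# Tone mapping HDR intensities into an 8-bit range with the Reinhard operator.
# The operator choice is illustrative; any curve that compresses [0, inf) works.
def tonemap_reinhard(hdr):
    return hdr / (1.0 + hdr)            # maps [0, inf) into [0, 1)

for hdr in [0.25, 1.0, 4.0, 16.0]:
    print(hdr, round(tonemap_reinhard(hdr) * 255))   # quantize to an RGBA8 channel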
 
Xbox threw away TOSLINK support for their later revisions for cost-cutting purposes as well. Doesn't seem completely out of line, considering newer receivers fully support whatever version of HDMI they're on now.
They only half did. No TOSLINK out of the box, I understand, but they still used a compression format which made TOSLINK use possible. Supporting only PCM (probably to cut costs) seems inane to me.
As for the newer-receiver argument, it could make sense if hardware/content providers had dropped support for older formats such as vanilla DD or DTS, which they haven't.

Doing this is akin to supporting only SD resolutions and 1080p, and telling people with 720/768-line displays to play in SD or upgrade.
 

v1oz

Member
bulk of WiiU games at E3 conf was 720p no AA

From scrutinizing exclusive games like Pikmin and ZombiU, I'm starting to side with the more conservative estimates of Wii U horsepower. Clearly, if this machine really has more power and a more modern feature set than current gen, we should be seeing better IQ than we're used to, even in launch titles.

Going back 10 years: when Luigi's Mansion was unveiled at Spaceworld, it was clearly a step up from everything else - even Dreamcast.
 

zbarron

Member
It says it only supports 6-channel PCM, but since HDMI is easily capable of 7.1 or more, is there anything that would actually hold a game back from supporting 7.1, or is it a possibility in the future? That's the only thing I'm bummed about. The PS3 has been capable of 7.1 for the past six years.
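For a sense of scale, raw PCM bandwidth is tiny either way (48kHz/24-bit assumed here purely for illustration), so whatever the limit is, it isn't raw HDMI audio bandwidth:

# Raw PCM bandwidth for 6 vs 8 channels, assuming 48kHz / 24-bit samples.
for channels in (6, 8):
    mbps = channels * 48000 * 24 / 1e6
    print(f"{channels}ch: {mbps:.2f} Mbit/s")
# 6ch: 6.91 Mbit/s
# 8ch: 9.22 Mbit/s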
 