
Wii U has 2GB of DDR3 RAM, [Up: RAM 43% slower than 360/PS3 RAM]

This all just justifies my decision to wait. For me, this was always going to be nothing but a Nintendo Games Box ™

I won't pay launch money for that, especially when the games aren't there yet. But 2-3 years from now I'll be glad to jump in and play the Nintendo games in HD. It's a shame the console can't be much more than that in this respect, but what can you do. I was going to get the PS4 anyway (barring huge fuck up). In the end though, Nintendo being Nintendo. No one should be surprised. Also I don't think I agree with people trying to downplay the likely difference between wiiU/PS4 and 720. Yeah it won't be something like HD vs non HD of this gen. But it'll potentially be fucking huge. And never underestimate developers. Just because engines are scalable doesn't mean they will be scaled, and if they are it doesn't mean they'll be remotely good compared to the rest. That's assuming 3rd party games sell a lot on wiiU to begin with.

Lots of interesting times ahead. The next year/year and a half are going to be hilarious/scary/fascinating.
 

gogogow

Member
Uhhh... wow, you really are wrong in every way in this post.

First off, if you think that everything gets drawn, even if it is offscreen, you're insane. There are tons of visibility determination algorithms that efficiently determine what should and should not be drawn. So no. And if the water is occluded by an opaque surface, I don't believe that tessellation would actually be applied because visibility determination occurs before the tessellation stage.

Secondly, I'm not sure you fully understand tessellation, so it's probably not a good idea to talk about performance regarding something you don't understand.
Talking about yourself?
Don't talk shit when YOU don't know shit, junior.

[Image: true-water-full-620.jpg]

[Image: true-water-mesh-620.jpg]


That's right. The tessellated water mesh remains in the scene, apparently ebbing and flowing beneath the land throughout, even though it's not visible. The GPU is doing the work of creating the mesh, despite the fact that the water will be completely occluded by other objects in the final, rendered frame. That's true here, and we've found that it's also the case in other outdoor areas of the game with a coastline nearby.

Obviously, that's quite a bit of needless GPU geometry processing load. We'd have expected the game engine to include a simple optimization that would set a boundary for the water at or near the coastline, so the GPU isn't doing this tessellation work unnecessarily.

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/3
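For what it's worth, the optimization the article wishes for is conceptually simple. Below is a minimal sketch (hypothetical names, not CryEngine's actual code) of culling water patches against the view frustum on the CPU so that off-screen patches never reach the GPU's tessellation stage. Patches that are in view but buried under the terrain would additionally need the coastline boundary the article suggests, or occlusion queries.

```cpp
// Minimal sketch: cull water patches on the CPU so fully off-screen ones are
// never submitted, and therefore never tessellated. Hypothetical names, not
// CryEngine code. Frustum plane normals are assumed to point into the frustum.
#include <array>
#include <vector>

struct Plane { float a, b, c, d; };                     // ax + by + cz + d = 0
struct AABB  { float min[3], max[3]; };

// True if the box is at least partially on the inner side of every plane.
bool intersectsFrustum(const AABB& box, const std::array<Plane, 6>& frustum) {
    for (const Plane& p : frustum) {
        // Test the box corner furthest along the plane normal (the "p-vertex").
        float x = (p.a >= 0.0f) ? box.max[0] : box.min[0];
        float y = (p.b >= 0.0f) ? box.max[1] : box.min[1];
        float z = (p.c >= 0.0f) ? box.max[2] : box.min[2];
        if (p.a * x + p.b * y + p.c * z + p.d < 0.0f)
            return false;                               // box fully outside this plane
    }
    return true;
}

struct WaterPatch { AABB bounds; /* mesh handle, material, etc. */ };

void submitVisibleWater(const std::vector<WaterPatch>& patches,
                        const std::array<Plane, 6>& frustum) {
    for (const WaterPatch& patch : patches) {
        if (!intersectsFrustum(patch.bounds, frustum))
            continue;                                   // skipped: the GPU never sees it
        // drawTessellatedWater(patch);                 // hypothetical submit call
    }
}
```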
 

chaosblade

Unconfirmed Member
Uhhh... wow, you really are wrong in every way in this post.

First off, if you think that everything gets drawn, even if it is offscreen, you're insane. There are tons of visibility determination algorithms that efficiently determine what should and should not be drawn. So no. And if the water is occluded by an opaque surface, I don't believe that tessellation would actually be applied because visibility determination occurs before the tessellation stage.

Secondly, I'm not sure you fully understand tessellation, so it's probably not a good idea to talk about performance regarding something you don't understand.

You should look up the Crysis 2 tessellation stuff. It's ridiculous.

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing

Edit: Ha, beaten.
 
PS3's memory is split into two pools: 256MB at 25.6GB/s and the other 256MB at 22.4GB/s. 360 is one 512MB pool at 22.4GB/s.

But 360's 10MB of eDRAM makes up for the lack of bandwidth in its main memory, and in most cases more than makes up for it. But its eDRAM isn't nearly as good as the WiiU's is going to be, due to the fact that it's on a separate die (rather than on-die like the WiiU's) and due to its size (only 10MB vs 32MB for the WiiU).

And, for instance, the CPU can access the XDR at the same time the GPU is accessing the GDDR3? Yeah, I see your point now. Forgot that there were actually benefits to the split pool approach.
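As a sanity check on those figures, peak bandwidth is just bytes per transfer times transfers per second. A quick sketch; the 360 line matches the 22.4GB/s quoted above, while the Wii U line assumes 64-bit DDR3-1600, which is this thread's speculation rather than anything official:

```cpp
// Back-of-the-envelope peak bandwidth: (bus width in bytes) x (transfer rate).
// The Wii U figure assumes 64-bit DDR3-1600 (the thread's speculation).
#include <cstdio>

double peakGBps(int busWidthBits, double megaTransfersPerSec) {
    return (busWidthBits / 8.0) * megaTransfersPerSec / 1000.0;   // GB/s
}

int main() {
    double x360 = peakGBps(128, 1400);   // 128-bit GDDR3 @ 700MHz -> 22.4 GB/s
    double wiiu = peakGBps(64, 1600);    // assumed 64-bit DDR3-1600 -> 12.8 GB/s
    std::printf("360 main RAM: %.1f GB/s, Wii U (assumed): %.1f GB/s\n", x360, wiiu);
    std::printf("Wii U would be about %.0f%% slower\n", (1.0 - wiiu / x360) * 100.0);
}
```

That last line is presumably where the "43% slower" figure in the thread title comes from.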
 

Reiko

Banned
Uhhh... wow, you really are wrong in every way in this post.

First off, if you think that everything gets drawn, even if it is offscreen, you're insane. There are tons of visibility determination algorithms that efficiently determine what should and should not be drawn. So no. And if the water is occluded by an opaque surface, I don't believe that tessellation would actually be applied because visibility determination occurs before the tessellation stage.

Secondly, I'm not sure you fully understand tessellation, so it's probably not a good idea to talk about performance regarding something you don't understand.

Now it's your turn. Fess up and admit that you're wrong.
 

Damn. Owned. I really wouldn't have thought they would do that. That seems so strange. I wonder if it's cheaper to just draw it than to do the visibility determination on it. Odd.

Conceded. lol

I've been learning visibility determination algorithms and I extrapolated knowledge from that to Crysis 2. Clearly, foot in mouth.
Now it's your turn. Fess up and admit that you're wrong.

Done. Not too proud to admit when I'm wrong.
 
It's not really fair to compare it to PC games, next-gen consoles won't compare favorably either unless they are significantly more expensive than the market will want to pay, or heavily subsidized like that one rumor suggests.


What rumor? Link?


By the way: The problem with the PS3 was you couldn't see the $200+ difference. In fact the multiplats looked even worse on that "high end" machine.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Of course you wouldn't, it doesn't suit your opinion. Let's all ignore the CEO of the company coming out and saying 100% it will be selling at a loss at launch, let's make up my own idea to suit myself ignoring the facts!

And you believe everything that any executive says, do you?
 

Triple U

Banned
It's relevant when the PS3 was selling as bad as it was at that high price (Remember how hard the GBA pushed it for a while?), and didn't start building momentum until the price came down (Remember how many 'Year of the PS3!' comments we got, like one every year). There's a reason why this gen has been longer than the average one, and the price was a big part of it.

No, it's not. How the console was performing at $600 had no bearing on that post. Really don't know how to spell it out anymore.
 

gatti-man

Member
This all just justifies my decision to wait. For me, this was always going to be nothing but a Nintendo Games Box ™

I won't pay launch money for that, especially when the games aren't there yet. But 2-3 years from now I'll be glad to jump in and play the Nintendo games in HD. It's a shame the console can't be much more than that in this respect, but what can you do. I was going to get the PS4 anyway (barring huge fuck up). In the end though, Nintendo being Nintendo. No one should be surprised. Also I don't think I agree with people trying to downplay the likely difference between wiiU/PS4 and 720. Yeah it won't be something like HD vs non HD of this gen. But it'll potentially be fucking huge. And never underestimate developers. Just because engines are scalable doesn't mean they will be scaled, and if they are it doesn't mean they'll be remotely good compared to the rest. That's assuming 3rd party games sell a lot on wiiU to begin with.

Lots of interesting times ahead. The next year/year and a half are going to be hilarious/scary/fascinating.

I agree. I buy my Nintendo consoles every other gen. The wiiU is a really cool piece of kit but it is a Nintendo console that's going to be best with Nintendo software only.
 

chaosblade

Unconfirmed Member
What rumor? Link?

Don't have a specific link since I've seen it in practically every next-gen thread here, talking about selling the next Xbox at $100 or $200 with a contract/subscription, which would subsidize the cost of the console over the course of the generation allowing Microsoft to put very powerful, otherwise unaffordable(/unsellable) hardware in the box.
 

beril

Member
And, for instance, the CPU can access the XDR at the same time the GPU is accessing the GDDR3? Yeah, I see your point now. Forgot that there were actually benefits to the split pool approach.

You should still be able to do that with unified memory, since there are still multiple chips. PS3 has more chips though. But I'd assume you only get the full benefit if you carefully split up the memory so you're reading from all of them at once.
 
They don't pay for licensing fees, no optical out, no Ethernet, cheap RAM... so unless there is something really expensive inside the GamePad (the only possibility left, I think), then I really don't see how they are selling this at a loss for $300/$350.
Nintendo knows more about their internal development than any of us.
 

Ydahs

Member
What about the geometry shader? If the WiiU is DirectX 10.1 hardware that means its geometry shader won't be as good as the one in the next PS360 systems.
Was there a big advancement on the hardware level regarding geometry shaders between DX10.1 and DX11? I'm not up to speed with the D3D11 API, but a weaker geometry shader isn't exactly a deal breaker.

WiiU games will undoubtedly look noticeably worse, but the main thing is they'll be supported by next gen engines.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Damn. Owned. I really wouldn't have thought they would do that. That seems so strange. I would guess that it's probably cheaper to draw than to do the visibility determination on it. Odd.

I remember reading an article where a PC dev said it was quicker to let the GPU overdraw rather than complete a geometry culling pass on the CPU.

On the other hand, Killzone 3 offloaded a geometry culling pass to one of the SPUs to reduce the burden of drawing geometry on the RSX as much as possible.

Which approach wins depends on how powerful the CPU is relative to the GPU.
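The Killzone 3 approach generalises to any platform with spare cores: split the object list across worker threads, frustum-test the pieces in parallel, and have the main thread submit only the survivors. A rough generic sketch using plain std::thread (not Guerrilla's actual SPU job code):

```cpp
// Generic sketch of a parallel visibility pass: each worker culls a slice of
// the object list, the main thread then submits only the visible objects.
#include <algorithm>
#include <functional>
#include <thread>
#include <vector>

struct Plane  { float nx, ny, nz, d; };   // plane normals point into the frustum
struct Sphere { float x, y, z, r; };      // bounding sphere per object

bool sphereVisible(const Sphere& s, const Plane* frustum, int planeCount) {
    for (int i = 0; i < planeCount; ++i) {
        const Plane& p = frustum[i];
        if (p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d < -s.r)
            return false;                 // entirely behind one plane
    }
    return true;
}

// Workers never touch the same element, so no locking is needed.
void cullRange(const std::vector<Sphere>& bounds, const Plane* frustum,
               size_t begin, size_t end, std::vector<char>& visible) {
    for (size_t i = begin; i < end; ++i)
        visible[i] = sphereVisible(bounds[i], frustum, 6) ? 1 : 0;
}

std::vector<char> parallelCull(const std::vector<Sphere>& bounds,
                               const Plane* frustum, unsigned workers) {
    if (workers == 0) workers = 1;
    std::vector<char> visible(bounds.size(), 0);
    std::vector<std::thread> pool;
    size_t chunk = (bounds.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end   = std::min(bounds.size(), begin + chunk);
        if (begin >= end) break;
        pool.emplace_back(cullRange, std::cref(bounds), frustum, begin, end,
                          std::ref(visible));
    }
    for (std::thread& t : pool) t.join();
    return visible;                       // main thread draws only objects marked 1
}
```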
 

Donnie

Member
They've researched the part numbers that were on the RAM chips and it was shown that they each have a 16-bit wide interface. Basically, you multiply that x4 because there are that many chips and that is your max bus width.

Oh I understand the 16x4 chips = 64-bit bus. I was just asking for clarification on what exactly 256Mx16 means? I now see where it's mentioned in the Samsung PDF, was just wondering as none of the chips are 256M... They're either 4Gb chips or 512MB chips (both the same thing, one in bits and the other in bytes).
 

Portugeezer

Gold Member
With a system update? What about if someone never connects to the internet? Can they still play the game?

Don't devs have to cater to the lowest common denominator?

I suspect there are alternate update methods.

But this is Nintendo, did they cater to the lowest common denominator when Skyward Sword required Wii Motion+?
 

Reiko

Banned
Was there a big advancement on the hardware level regarding geometry shaders between DX10.1 and DX11? I'm not up to speed with the D3D11 API, but a weaker geometry shader isn't exactly a deal breaker.

WiiU games will undoubtedly look noticeably worse, but the main thing is they'll be supported by next gen engines.

For Crysis 3, there's a crap ton of effects that are tied to the DirectX 11 API. I mean the alpha downright refuses to let you play on DX10.

I'm expecting a complete downgrade on the PS3/360.

If the Wii U can fit somewhere in between I'll be impressed.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
With a system update? What about if someone never connects to the internet? Can they still play the game?

Don't devs have to cater to the lowest common denominator?

Newer games might have a mandatory OS upgrade on the disk.
 

Donnie

Member
And, for instance, the CPU can access the XDR at the same time the GPU is accessing the GDDR3? Yeah, I see your point now. Forgot that there were actually benefits to the split pool approach.

Yeah there are definitely benefits, but as the 360 shows, even a rather limited implementation of eDRAM has equally big advantages. That's why my view has always been that the WiiU would have something like 20GB/s or so of main memory bandwidth, but 17GB/s is a bit lower than I expected, obviously.
 

Thraktor

Member
Love this logic.

It's perfectly valid logic. The only next-gen engine we have a decent amount of info about the workings of is UE4, where the main innovation is a lighting system based on SVOGI (sparse voxel octree global illumination). The grunt work of SVOGI consists of cone traces performed on the GPU, which requires GPGPU capabilities, which means the PS3 and XBox360 simply couldn't perform this technique. The Wii U's GPU can, although of course we don't know how limited the quality would end up being.
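For anyone wondering what a "cone trace" actually does, here is a very rough sketch of the core loop: march along the cone, sample progressively coarser mips of prefiltered voxel data as the cone widens, and composite the samples front to back. The dense mip-mapped volume and the sampleVoxelMip() stub below are simplifying assumptions; a real SVOGI implementation runs this on the GPU against a sparse voxel octree.

```cpp
// Rough CPU-side sketch of a single voxel cone trace, the core operation
// behind SVOGI. The sampling function is a placeholder, not a real renderer.
#include <algorithm>
#include <cmath>

struct Vec3   { float x, y, z; };
struct Sample { float occlusion; Vec3 radiance; };

// Placeholder for a trilinear lookup into prefiltered (mip-mapped) voxel data.
// Returns a constant so the sketch is self-contained; purely hypothetical.
Sample sampleVoxelMip(const Vec3& /*pos*/, float /*mip*/) {
    return {0.1f, {0.2f, 0.2f, 0.2f}};
}

// March a cone from 'origin' along 'dir'; 'apertureTan' is tan(half-angle).
Vec3 coneTrace(const Vec3& origin, const Vec3& dir,
               float apertureTan, float maxDist, float baseVoxelSize) {
    Vec3  color{0.0f, 0.0f, 0.0f};
    float alpha = 0.0f;
    float dist  = baseVoxelSize;                      // skip the first voxel to avoid self-hits
    while (dist < maxDist && alpha < 0.99f) {
        float diameter = std::max(baseVoxelSize, 2.0f * apertureTan * dist);
        float mip      = std::log2(diameter / baseVoxelSize);   // wider cone -> coarser mip
        Vec3  p{origin.x + dir.x * dist,
                origin.y + dir.y * dist,
                origin.z + dir.z * dist};
        Sample s = sampleVoxelMip(p, mip);
        float  w = (1.0f - alpha) * s.occlusion;      // front-to-back compositing
        color.x += w * s.radiance.x;
        color.y += w * s.radiance.y;
        color.z += w * s.radiance.z;
        alpha   += w;
        dist    += diameter * 0.5f;                   // step roughly with the cone footprint
    }
    return color;                                     // indirect light gathered along this cone
}
```

A handful of these traces per pixel (one roughly along the surface normal, a few angled around it) is roughly what gives the diffuse GI, with a tight cone along the reflection vector for glossy reflections.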
 

charsace

Member
Was there a big advancement on the hardware level regarding geometry shaders between DX10.1 and DX11? I'm not up to speed with the D3D11 API, but a weaker geometry shader isn't exactly a deal breaker.

WiiU games will undoubtedly look noticeably worse, but the main thing is they'll be supported by next gen engines.

DirectX 11.1 GPUs will have geometry shader hardware that is more efficient.
 
[Image: 20gm5c.gif]


Seriously Nintendo...... We want this game!

If the next LOZ looks like that demo then I am pleased and couldn't care less about the hardware.
 

Thraktor

Member
Oh I understand the 16x4 chips = 64bit bus. I was just asking for clarification on what exactly 256Mx16 means? I now see were its mentioned on the Samsung PDF, was just wondering as none of the chips are 256M.. They're either 4Gb chips or 512MB chips (both the same thing one in bits and the other in bytes)

256Mx16 is just the organization of each package: 256M addressable words, each 16 bits wide, so 256M x 16 bits = 4096Mb (4Gb, or 512MB) per package, and the x16 part is what gives each chip its 16-bit data interface.
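To spell the arithmetic out (nothing here beyond the figures already in the thread):

```cpp
// Check of the DRAM organization arithmetic: "256M x16" = 256M addressable
// words, each 16 bits wide, with four such packages on the board.
#include <cstdio>

int main() {
    const long long words    = 256LL * 1024 * 1024;  // 256M addresses per package
    const int       wordBits = 16;                   // x16 organization -> 16-bit data interface
    const int       chips    = 4;                    // four packages on the Wii U board

    long long bitsPerChip  = words * wordBits;       // 4Gb per package
    long long bytesPerChip = bitsPerChip / 8;        // 512MB per package
    long long totalBytes   = bytesPerChip * chips;   // 2GB total
    int       busWidth     = wordBits * chips;       // 64-bit combined bus

    std::printf("%lld Gb per chip, %lld MB per chip, %lld GB total, %d-bit bus\n",
                bitsPerChip >> 30, bytesPerChip >> 20, totalBytes >> 30, busWidth);
}
```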
 

Jonm1010

Banned
This all just justifies my decision to wait. For me, this was always going to be nothing but a Nintendo Games Box ™

I won't pay launch money for that, especially when the games aren't there yet. But 2-3 years from now I'll be glad to jump in and play the Nintendo games in HD. It's a shame the console can't be much more than that in this respect, but what can you do. I was going to get the PS4 anyway (barring huge fuck up). In the end though, Nintendo being Nintendo. No one should be surprised. Also I don't think I agree with people trying to downplay the likely difference between wiiU/PS4 and 720. Yeah it won't be something like HD vs non HD of this gen. But it'll potentially be fucking huge. And never underestimate developers. Just because engines are scalable doesn't mean they will, and if they do it doesn't mean they'll be remotely good compares to the rest. That's assuming 3rd party games sell a lot on wiiU to begin with.

Lots of interesting times ahead. The next year/year and a half are going to be hilarious/scary/fascinating.

Pretty much this. I just bought an x51 to get into PC gaming and this sort of stuff is making my decisions feel much better right now.

I'll probably pick one up in a couple years if Nintendo has some good games this round.

However I feel Nintendo is about to run into the same problem with the WiiU as the Wii had with third parties. Developers are gonna see the underwhelming specs and early on either shun it or provide only lukewarm support, especially once the other next gen consoles roll around next year. Nintendo owners will see the lack of quality third party games and not buy them at a high clip, and developers will respond with even less support. Creating another vicious cycle for Nintendo with third parties.

Though my thoughts on third parties revolve more around Western studios, seeing as Nintendo has secured some solid Japanese third party support.
 

apana

Member
If they aren't making a profit I can't be too angry about it. I like the idea of console companies going in different directions. The graphics don't really make a difference to me, I wouldn't have bought it for 2-3 years regardless of how powerful it was.
 

Donnie

Member
There are four modules on the Wii U's motherboard, each of which has a 16-pin data interface.

I know that's what's being said but what I'm trying to pin down is where the info is coming from that each interface is 16-bit? Nobody's said they've found that on the board AFAICS; in the OP it says:

256Mx16 would imply 16-bit I/O per DRAM

I'm just wondering how 256Mx16 implies 16-bit I/O per DRAM? Obviously it being an implication means it's being assumed based on the RAM configuration of 256Mx16, and I'm not sure how that works. RAM bus configurations aren't my strong point.

I'm not trying to say anyones wrong by the way, I just like to understand things :D
 

defferoo

Member
They should. The OS is currently bloated.

Question is, will they? Given how crappy their apparent memory management is, I would say they don't have much flexibility in that ballpark... especially since the internet browser is designed to run in parallel with games. Now if they could swap and page memory between RAM and flash, then we might be on to something.

It's seriously mind-boggling: one system app runs at a time and it's given the entire 1GB of system RAM to work with? That seems to be what's happening anyway, given that loading up the system settings menu takes 15-20 secs... I mean, even if the system settings binary were 100MB (which it really shouldn't be), it shouldn't take that long to load the entire thing from flash memory to RAM and start the application.
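Rough arithmetic on that point (the read speed below is a guess at typical eMMC-class flash, not a measured Wii U figure): even a 100MB binary should stream in a handful of seconds, so the 15-20 second wait is presumably OS overhead rather than raw I/O.

```cpp
// Rough load-time arithmetic. The sustained read speed is an assumption
// (typical eMMC-class flash), not an actual Wii U measurement.
#include <cstdio>

int main() {
    const double binaryMB = 100.0;   // generous size for a settings app
    const double readMBps = 20.0;    // assumed sustained flash read speed
    std::printf("~%.0f seconds to stream %.0f MB at %.0f MB/s\n",
                binaryMB / readMBps, binaryMB, readMBps);
}
```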
 