
WiiU technical discussion (serious discussions welcome)

Donnie

Member
Durango will purportedly have 5-7x the quantity of RAM, and even using DDR3 it's expected that it will be ~5x the bandwidth, alongside 32MB of fast ESRAM. How comparable are the situations really?

It will reportedly have 4x as much RAM; we'll see how much RAM it has for games in comparison to Wii U once it's released. At most it looks like 5x, but Wii U could easily have more than 1GB for games by then. Also, what's this talk of embedded SRAM? Why on earth would you use SRAM for a large pool of memory inside a GPU?
 

Donnie

Member
I won't claim to have a full grasp of the situation here, but I believe the most pressing matter at hand is keeping the GPU fed with texture data. In that area, Xenos has access to the full ~22 GB/s of bandwidth.

You can't read 22GB/s of textures from the 360's RAM, nor would you ever need to. With only 480MB of usable RAM it would be utterly unnecessary (even with 1GB of RAM it would be quite pointless).
 
The Wii U reserves 1GB of RAM for the OS, leaving 1GB for games. The 360 apparently uses 32MB for its OS, leaving 480MB for games. So there's only a 2x increase there.

We also know that the Wii U's RAM bandwidth is 12.8GB/s, vs 22.4GB/s on the 360. So it's about half as fast here.

We also know that the Wii U's CPU cores are single-threaded, while the 360's are dual-threaded. We also know that the Metro 2033 devs think the CPU is "horrible, slow".

There are a lot of things we know and a lot of things we don't know.
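
(For reference, both of those bandwidth figures fall out of a simple peak-bandwidth calculation. The memory configurations in the Python sketch below, and the helper name, are the commonly reported ones rather than confirmed specs, so treat this as a rough sanity check only.)

# Rough peak-bandwidth sanity check for the figures quoted above.
# Assumed (commonly reported, not confirmed) configurations:
#   Wii U: 64-bit bus, DDR3-1600 (1600 MT/s)
#   360:   128-bit bus, GDDR3 at 700 MHz (1400 MT/s effective)

def peak_bandwidth_gb_s(bus_width_bits, mega_transfers_per_s):
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * mega_transfers_per_s * 1e6 / 1e9

print(peak_bandwidth_gb_s(64, 1600))   # Wii U main RAM -> 12.8 GB/s
print(peak_bandwidth_gb_s(128, 1400))  # 360 main RAM   -> 22.4 GB/s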

Didn't he or the company recently take this back saying they were working with old devkits or something?

Beaten like a stubborn mule.
 

Rolf NB

Member
Eh, it was never a debate. We knew since 2011 the Wii U was next gen.
People arguing otherwise are just doing it for their own times sake.
What does this even mean? And why would people care enough to have arguments about it?

Stock 4770s are also running at a higher clock and (probably) voltage. Think about that for a moment. Latte is bigger and therefore most likely has quite a few more transistors. We actually know that because we've seen the die. Again, I'm not saying it's 600GFLOPS or anything, but just looking at the wattage doesn't really tell us all that much.
Latte includes the eDRAM.
B3D sez RV740 is 138mm².
RWT sez 1MBit of 45nm IBM eDRAM is 0.24mm², giving us 61.44mm² for 32MByte. So it's not just clocks. The logic portion (chip sans eDRAM) in Latte actually is significantly smaller than RV740.
RWT said:
The 1Mbit eDRAM macros used in the POWER7 are 0.24mm2 with a 1.05V supply and 1.7ns/1.35ns cycle and access times.
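
(To make the arithmetic behind that 61.44mm² figure explicit, here is a quick Python sketch using only numbers already cited in this thread: the RWT 0.24mm²/Mbit macro figure, the ~156mm² Latte die estimate mentioned later in the thread, and B3D's 138mm² for RV740. If the eDRAM is actually NEC 40nm rather than IBM 45nm, as another poster suggests below, the result is only a ballpark.)

# Back-of-the-envelope area estimate, using the figures quoted in this thread.
MACRO_MM2_PER_MBIT = 0.24        # RWT: IBM 45nm eDRAM macro
EDRAM_MBIT = 32 * 8              # 32 MByte = 256 Mbit

edram_area_mm2 = EDRAM_MBIT * MACRO_MM2_PER_MBIT   # = 61.44 mm²

LATTE_DIE_MM2 = 156.0            # approximate Latte die size cited later in the thread
RV740_DIE_MM2 = 138.0            # B3D figure

logic_area_mm2 = LATTE_DIE_MM2 - edram_area_mm2    # ~94.6 mm² of non-eDRAM logic
print(edram_area_mm2, logic_area_mm2, RV740_DIE_MM2)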
 
What does this even mean? And why would people care enough to have arguments about it?

Latte includes the eDRAM.
B3D sez RV740 is 138mm².
RWT sez 1MBit of 45nm IBM eDRAM is 0.24mm², giving us 61.44mm² for 32MByte. So it's not just clocks. The logic portion (chip sans eDRAM) in Latte actually is significantly smaller than RV740.



IIRC you'd have to subtract some things from the RV740 die size, like the GDDR5 memory controller and also the PCI Express interface.

Here's an example with the HD 4870:
[image: slides09.jpg]
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
What does this even mean? And why would people care enough to have arguments about it?

Latte includes the eDRAM.
B3D sez RV740 is 138mm².
RWT sez 1MBit of 45nm IBM eDRAM is 0.24mm², giving us 61.44mm² for 32MByte. So it's not just clocks. The logic portion (chip sans eDRAM) in Latte actually is significantly smaller than RV740.
That's a very rough estimate. Aside from what GhostTrick brought up, chances are high that the GPU eDRAM is NEC's own UX8GD/UX8LD 40nm tech.

Apropos,

Compute starved. They more than tripled aggregate main memory bandwidth and more than doubled its size -- or if you take into account that Gamecube ARAM was too slow for any meaningful working set, unlike in Wii, you might as well say they scaled it by 3.6 times. OTOH GPU and CPU performance scaled only by 50%. Plus framebuffer size restrictions.

Based on the assumption that we regard Gamecube as a balanced, efficient design, we simply have to observe how unevenly the components in Wii were scaled up.
A hypothetical imbalance in BW/compute does not mean that the system as a whole is compute-starved. A compute-starved system would imply that a certain quantity of outputs (say, ROPs) are not backed up by an adequate amount of ALUs. That was not the case with Wii. Whether it had a spare BW resource or not cannot be said unless you profiled an exemplary BW-heavy, e.g. EMBM-rich, scenario. I think you're reaching a bit here.
 

Ryoku

Member
That's actually the problem with a lot of Wii U questions. We don't know if the GPU is 40nm or 55nm, or if the eDRAM is 45nm or below.

I think everything points to either 40nm or less. Don't quote me on it, but wasn't it stated [almost] two years ago that it was 40nm? Or was that the CPU?
 
I think everything points to either 40nm or less. Don't quote me on it, but wasn't it stated [almost] two years ago that it was 40nm? Or was that the CPU?



Well, tbh, I'm not working with rumors or "people said", but actual facts. Wii U's GPU die size is a fact, for example. Now, the eDRAM die size and the actual GPU size are unknown, unless someone takes X-ray shots or finds out the process (40nm or 55nm).
I also don't know if Wii U is more powerful than PS3/360, on par, or less powerful. I'm just working with facts.
 

Ryoku

Member
Well, tbh, I'm not working with rumors or "people said", but actual facts. Wii U's GPU die size is a fact, for example. Now, the eDRAM die size and the actual GPU size are unknown, unless someone takes X-ray shots or finds out the process (40nm or 55nm).
I also don't know if Wii U is more powerful than PS3/360, on par, or less powerful. I'm just working with facts.

This was what I was thinking of. Pardon me, I'm tired as hell :(
 
That's actually the problem with a lot of Wii U questions. We don't know if the GPU is 40nm or 55nm, or if the eDRAM is 45nm or below.


The eDRAM is on the same die as the GPU, so we do know that the GPU and the eDRAM use the same process. And that process is either 55nm (very, very unlikely imo) or 40nm. It's definitely not 45nm.
 
Could the GPU be even higher than 500-600 gflops if it was based on the mobile 4830? At 40nm it was something like 26 gflops per watt.

Assuming the gpu is 25-30 watts: 26 x 30 = 780 gflops.
 
My WiFi drops every time I start up my Wii U. Then it asks me to check my connection and enter my WPA2 password. Isn't the password saved? Because every time I enter it, it works. But it is pretty damn annoying to do this every time I start up my Wii U.

What gives?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
My WiFi drops every time I start up my Wii U. Then it asks me to check my connection and enter my WPA2 password. Isn't the password saved? Because every time I enter it, it works. But it is pretty damn annoying to do this every time I start up my Wii U.

What gives?
Have you tried contacting customer support? BTW, this is not the thread for such questions. Try some of the impression threads.
 

chaosblade

Unconfirmed Member
Could the GPU be even higher than 500-600 gflops if it was based on the mobile 4830? At 40nm it was something like 26 gflops per watt.

Assuming the gpu is 25-30 watts: 26 x 30 = 780 gflops.

The entire system is 32W under load, so you're aiming a little high there.

Not that it's an entirely accurate comparison anyway.
 

ozfunghi

Member
Could the GPU be even higher than 500-600 gflops if it was based on the mobile 4830? At 40nm it was something like 26 gflops per watt.

Assuming the gpu is 25-30 watts: 26 x 30 = 780 gflops.

The GPU runs at 550 MHz; I think 480 SPUs is about the maximum to realistically expect. That would mean 528 GFLOPS. If it's only ~320 SPUs, that number drops to ~350 GFLOPS. I would think it's somewhere in between.
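
(Those GFLOPS figures come from the usual peak-throughput formula for AMD's VLIW5 parts: each stream processor can retire one multiply-add, i.e. two floating-point operations, per clock. A quick Python sketch for the SP counts under discussion; the counts themselves are still speculation, and the helper name is just illustrative.)

# Peak single-precision GFLOPS = stream processors * 2 FLOPs/clock * clock (GHz).
def peak_gflops(stream_processors, clock_mhz):
    return stream_processors * 2 * clock_mhz / 1000.0

for sps in (320, 400, 480):
    print(sps, "SPs @ 550 MHz ->", peak_gflops(sps, 550), "GFLOPS")
# 320 -> 352, 400 -> 440, 480 -> 528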
 

tipoo

Banned
That's not really a fair comparison, since one is a brand new console getting rushed ports and the other is years old with mature tools and lots of dev experience.



That "compute module" would be for things like physics, which would help out with graphics just because the GPU wouldn't have to be used for tasks offloaded from the CPU.



Even day one games for 360 and PS3 demonstrated that they could do significantly more than the previous generation. Yes things got even better over time, but even at launch you could tell they were more powerful. With the Wii U, can you point to anything and say the previous gen could not do that? Even without reworking their whole toolchain, if it was significantly more powerful devs could at least use better AA and higher AF and motion blurring methods, but on the contrary, they seem to be inferior in places (ok ok Trine 2, but that's a sample size of one).

At the very least we can safely say if it is more powerful, it's not enough to overcome initial optimization issues, which means not a large jump if there is one.
 

tipoo

Banned
Unlockable power dude. Unlockable power.. ;)

It has a 70 watt power brick and it draws 33 watts. It's fairly typical for systems to draw about half of what their power supply can deliver (with a 250 watt power brick the Falcon 360 drew 120 watts, for instance; similar with the PS3). No, I don't think they will be clocking anything higher than they already do. What would be the point in waiting? It already gets bad press over its power, so why not offer the full power on day one if it can be "unlocked"?

Mighty Switch Force HD, Scribblenauts Unlimited, Monster Hunter 3 Ultimate and some others are all native 1080p.

All games with pretty simple visuals, if I'm not mistaken, and I think some PS3 and 360 games were 1080p as well? In fact, didn't the PS3 originally have a 1080p-only rule for games, which they later dropped?
 

THE:MILKMAN

Member
Can anyone explain why WiiU seems to limit power usage to 32/33W no matter how graphically intensive the game? ie: Nintendoland vs AC3.

Is it a Nintendo imposed limit due to the size of the box/cooling system?
 

ozfunghi

Member
Even day one games for 360 and PS3 demonstrated that they could do significantly more than the previous generation. Yes things got even better over time, but even at launch you could tell they were more powerful. With the Wii U, can you point to anything and say the previous gen could not do that? Even without reworking their whole toolchain, if it was significantly more powerful devs could at least use better AA and higher AF and motion blurring methods, but on the contrary, they seem to be inferior in places (ok ok Trine 2, but that's a sample size of one).

What does that have to do with anything? Seriously. PS360 were 8-10 times more powerful than the previous generation. Of course launch games would look better; going HD alone would take care of that. The Wii U is NOT 8-10 times as powerful as PS360, and nobody has ever claimed this. The gap is much smaller; let's say, for argument's sake, it is twice as powerful overall. That is not enough for launch games to look better than games made for PS360, where developers have had 7 years to get to know the hardware and exploit every loophole they could find.

Your comparison makes no sense. It's like putting Sebastian Vettel in a Formula 3000 car while you yourself get to drive the F1 Red Bull. Do you think you would have a chance of beating him? And would that mean the F1 car is inferior to the F3000 car?
 

tipoo

Banned
I think everything points to either 40nm or less. Don't quote me on it, but wasn't it stated [almost] two years ago that it was 40nm? Or was that the CPU?

The CPU is being built at IBM's 45nm plants. I thought the GPU was 40nm too, but I can't find a source on that; we do know the CPU fab process for sure.
 

tipoo

Banned
What does that have to do with anything? Seriously. PS360 were 8-10 times more powerful than the previous generation. Of course launch games would look better; going HD alone would take care of that. The Wii U is NOT 8-10 times as powerful as PS360, and nobody has ever claimed this. The gap is much smaller; let's say, for argument's sake, it is twice as powerful overall. That is not enough for launch games to look better than games made for PS360, where developers have had 7 years to get to know the hardware and exploit every loophole they could find.

Your comparison makes no sense. It's like putting Sebastian Vettel in a Formula 3000 car while you yourself get to drive the F1 Red Bull. Do you think you would have a chance of beating him? And would that mean the F1 car is inferior to the F3000 car?

No, that's exactly the point I was trying to make. I'm not saying it's less powerful, just that if it is more powerful, it's obviously not hugely so.


Can anyone explain why WiiU seems to limit power usage to 32/33W no matter how graphically intensive the game? ie: Nintendoland vs AC3.

Is it a Nintendo imposed limit due to the size of the box/cooling system?


See my other post, a system drawing about half of its max rated power is pretty typical. I think that's just how it is, not an artificial cap.
 
The GPU runs at 550 MHz; I think 480 SPUs is about the maximum to realistically expect. That would mean 528 GFLOPS. If it's only ~320 SPUs, that number drops to ~350 GFLOPS. I would think it's somewhere in between.


I'm still not sure how people think 320 SPs is even possible. Wii U's GPU die is ~156mm²; minus something like 35-45mm² for the eDRAM, that leaves at least 110mm² for the GPU. Now, Redwood is 40nm, 104mm², and has 400 SPs. Given that, how could Wii U's GPU possibly have much less (Turks has 480 SPs at 118mm², just btw)?
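
(A compact way to frame that area argument, using only the estimates quoted in this post; the 35-45mm² eDRAM range and the ~156mm² die size are rough figures, so the resulting logic-area band is indicative at best.)

# Rough die-area budget, all figures being the estimates quoted above.
LATTE_DIE_MM2 = 156.0
EDRAM_MM2_LOW, EDRAM_MM2_HIGH = 35.0, 45.0

logic_low  = LATTE_DIE_MM2 - EDRAM_MM2_HIGH   # ~111 mm²
logic_high = LATTE_DIE_MM2 - EDRAM_MM2_LOW    # ~121 mm²

# Known 40nm AMD parts for comparison: (die size in mm², stream processors)
REDWOOD = (104.0, 400)
TURKS   = (118.0, 480)

print(f"Latte logic: {logic_low:.0f}-{logic_high:.0f} mm²,",
      f"Redwood: {REDWOOD}, Turks: {TURKS}")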
 

ozfunghi

Member
No, that's exactly the point I was trying to make. I'm not saying it's less powerful, just that if it is more powerful, it's obviously not hugely so.

The thing is, we have "some" concrete numbers and facts, but obviously not enough to come to a conclusion. We can gather "some" information based on how launch games look (and suffer), but again, that is merely enough to speculate. Either we'll have to wait and see how games turn out in the future, or we'll need a complete specsheet to know how things stand.

I'm still not sure how people think 320 SPs is even possible. Wii U's GPU die is ~156mm²; minus something like 35-45mm² for the eDRAM, that leaves at least 110mm² for the GPU. Now, Redwood is 40nm, 104mm², and has 400 SPs. Given that, how could Wii U's GPU possibly have much less (Turks has 480 SPs at 118mm², just btw)?

I'm not claiming it's only 320 SPUs. In fact, I've always assumed 480 SPUs @ 480 MHz (460 GFLOPS) would be a realistic expectation, but the GPU is already known to be clocked faster. I don't know if that die space could be taken up by something else. But if the GPU was really 600 GFLOPS or more, certain launch games shouldn't have suffered as much as they did, I would assume. But I'm not a tech expert by any means.
 

THE:MILKMAN

Member
See my other post, a system drawing about half of its max rated power is pretty typical. I think that's just how it is, not an artificial cap.

Oh I know all about that. Even Sony describe the super slim as having 190W power consumption!

My question was about how it's possible that AC3 consumes the same power as Nintendoland. This isn't the case with my games on PS3. ie: Killzone 2 uses a lot more power than Pro Evolution Soccer.

I'm just curious how all the games tested in a DF article used 32/33W with basically zero variance.
 

tipoo

Banned
Oh I know all about that. Even Sony describe the super slim as having 190W power consumption!

My question was about how it's possible that AC3 consumes the same power as Nintendoland. This isn't the case with my games on PS3. ie: Killzone 2 uses a lot more power than Pro Evolution Soccer.

I'm just curious how all the games tested in a DF article used 32/33W with basically zero variance.



I'm guessing it's because the CPU and GPU ramp up to max clocks for either game. The PS3 and 360 have more speeds to step up/down to, I guess; with the Wii U's CPU clocked at 1.2GHz and the system running so cool, it may as well sit at max speed at all times while a game is running.

Actually, no consoles are really great at managing power consumption. If I remember right the PS3 or 360 would stay at 80 watts or something just watching the dashboard, while any modern PC would ramp the CPU and GPU way down.
 

Mr Swine

Banned
Hi guys, simple question about the Wii U GamePad.

Is it possible for game developers to make the screen refresh at, like, 1 frame per second, or the lowest rate possible, to free up more power for the graphics?
 

ozfunghi

Member
Hi guys, simple question about the Wii U GamePad.

Is it possible for game developers to make the screen refresh at, like, 1 frame per second, or the lowest rate possible, to free up more power for the graphics?

Well, all I can say is that I had the impression that the screen in ZombiU sometimes "lagged", as in, was not refreshed immediately. Maybe it is already being done? I have no idea what the benefits, if any, would be.
 

Zornica

Banned
Even day one games for 360 and PS3 demonstrated that they could do significantly more than the previous generation. Yes things got even better over time, but even at launch you could tell they were more powerful. With the Wii U, can you point to anything and say the previous gen could not do that? [...]

Not if they were ports, like almost all Wii U games out there.

I think we can all agree that the 360 was way more advanced than the original Xbox. So how do you explain this, then?
http://www.gamespot.com/features/xbox-vs-xbox-360-do-you-really-need-hd-6140621/

Most Wii games nowadays look better than those early 360 games. No matter how powerful the hardware, if the source material is shit, it usually stays that way.


[...]All games with pretty simple visuals if I'm not mistaken, and I think some PS3 and 360 games were 1080P as well? In fact didn't the PS3 originally have a rule for 1080p only games which they dropped?

That was proven wrong like six years ago.
There are about 12 (!) games running in full 1080p, and ALL of them are sports games or PS2 ports/remakes like the GoW collection or ICO/SotC. Most of the more demanding PS3 games don't even run in full 720p, especially if they are multiplatform games.
http://forum.beyond3d.com/showthread.php?t=46241
At least late-gen 360 ports run on par on Wii U, even with its supposedly weak hardware.



Edit: Regarding the 1GB OS reservation, I'd be surprised if it stayed that way forever. At launch, the 3DS had a whole core locked away and dedicated to the OS. Sometime later they saw that this wasn't necessary, so they removed that restriction.
 

Metazoid

Banned

Edit: Regarding the 1GB OS reservation, I'd be surprised if it stayed that way forever. At launch, the 3DS had a whole core locked away and dedicated to the OS. Sometime later they saw that this wasn't necessary, so they removed that restriction.

I don't see why they would need a whole gig for the OS anyway, unless it's required for the multitasking you can do.
 

Oblivion

Fetishing muscular manly men in skintight hosery
Okay, there seems to be a lot of different kinds of ATi/AMD graphics cards, so it gets pretty confusing for some non-tech savvy folk like myself. From what I've seen, there's:

-RXXX series (R700/R600/Etc.)
-RV-XXX series (RV700/etc.)
-X-YYYY series (X1800/X1900/etc.)
-HD-XXXX series (HD-4000/HD-5000/etc.)

Can someone explain the difference between these four?
 

tipoo

Banned
Hi guys, simple question about the Wii U GamePad.

Is it possible for game developers to make the screen refresh at, like, 1 frame per second, or the lowest rate possible, to free up more power for the graphics?


Err, if something is sufficiently demanding then the framerate could slow down to one frame a second on its own, the developer doesn't set the frame rate. But what kind of game could you comfortably play at that rate which would require such fancy graphics? Think about it, one frame a second is like a slideshow.
 
Okay, there seems to be a lot of different kinds of ATi/AMD graphics cards, so it gets pretty confusing for some non-tech savvy folk like myself. From what I've seen, there's:

-RXXX series (R700/R600/Etc.)
-RV-XXX series (RV700/etc.)
-X-YYYY series (X1800/X1900/etc.)
-HD-XXXX series (HD-4000/HD-5000/etc.)

Can someone explain the difference between these four?

http://en.wikipedia.org/wiki/Radeon

You're mixing up card names and chip names.
 

tipoo

Banned
Okay, there seems to be a lot of different kinds of ATi/AMD graphics cards, so it gets pretty confusing for some non-tech savvy folk like myself. From what I've seen, there's:

-RXXX series (R700/R600/Etc.)
-RV-XXX series (RV700/etc.)
-X-YYYY series (X1800/X1900/etc.)
-HD-XXXX series (HD-4000/HD-5000/etc.)

Can someone explain the difference between these four?



I think you are confusing code names with retail card names. The R700 series is the Radeon HD 4000 series at retail, for instance. Anything with an R in front is a code name; you can figure out the retail card name through Wikipedia.

You don't need to worry about the RV names; they point to the same family as the R names.

The HD prefix is just a moniker they added starting with the 2000 series. That was their change to unified shaders, so you could say the HD name signals that. Before that, the X1000 series (X1900, etc.) had separate, non-unified vertex and pixel shaders.
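
(For orientation, a rough mapping of the chip codenames to the retail families being discussed; these are the commonly cited pairings, not an exhaustive or authoritative list.)

# Common ATI/AMD chip codename -> retail family pairings (rough orientation only).
CODENAME_TO_RETAIL = {
    "R520/R580": "Radeon X1800/X1900 (pre-unified-shader era)",
    "R600":      "Radeon HD 2900",
    "RV770":     "Radeon HD 4870/4850",
    "RV740":     "Radeon HD 4770",
    "Redwood":   "Radeon HD 5500/5600 series",
    "Turks":     "Radeon HD 6500/6600 series",
}

for chip, retail in CODENAME_TO_RETAIL.items():
    print(f"{chip:10s} -> {retail}")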
 

ozfunghi

Member
Okay, there seems to be a lot of different kinds of ATi/AMD graphics cards, so it gets pretty confusing for some non-tech savvy folk like myself. From what I've seen, there's:

-RXXX series (R700/R600/Etc.)
-RV-XXX series (RV700/etc.)
-X-YYYY series (X1800/X1900/etc.)
-HD-XXXX series (HD-4000/HD-5000/etc.)

Can someone explain the difference between these four?

R/RV are the chips you will find on the X/HD graphics cards. When people talk about HD4xxx inside the WiiU, they actually mean the corresponding chip on that card.
 

Oblivion

Fetishing muscular manly men in skintight hosery
http://en.wikipedia.org/wiki/Radeon

You mix up card names and chip names.

I think you are confusing code names for graphics card names. The R700 series is the Radeon HD 4000 series at retail for instance. Anything with the R in front is a code name, you can figure out what the retail card name is through wikipedia.

You don't need to worry about the RV names, they point to the same family as the R names.

The HD series is just a moniker they added on since the 2000 series. That was their change to unified shaders, so you could say the HD name signals that. Before that the x1000 series (x1900, etc) were fixed function shaders.

R/RV are the chips you will find on the X/HD graphics cards. When people talk about HD4xxx inside the WiiU, they actually mean the corresponding chip on that card.

Ah, okay now it seems a lot clearer. Danke!
 

JordanN

Banned
What does this even mean? And why would people care enough to have arguments about it?
As in, if you think the Wii U is really devoid of any information saying it's more powerful (thus bringing up a debate), that's not true. I then backed this up by saying we've known this since 2011. It may sound weird, but it's the only way I can put it.

Oh, and I only say this because when I see "debate", it's usually someone trying to lend legitimacy to a question that already has an actual answer. Like if I wanted to debate "the PS3 is weaker than the Dreamcast": just saying "the PS3 is weaker" isn't enough to make the situation actually look "pathetic".
 

JordanN

Banned
Sorry for the double post, but I had to get this off my chest.

After browsing the "games known for their visuals" thread, someone reminded me of The Conduit for Wii. For those that don't know, The Conduit was supposed to be a game promising Xbox 360 graphics on the Wii. They even made a video showcasing what their engine could do.
https://www.youtube.com/watch?v=5tovnipDToc

It obviously never matched the 360 1:1 but it was an impressive effort given the API they had to work with.

Now imagine if a developer went through the same effort for Wii U, trying to bring PS4/720 visuals to the console. It's unlikely to be 1:1, but with three out-of-order CPU cores, 1GB of RAM, and a 2012 GPU, the difference between Wii U and PS4/720 should be far smaller this time.
 