
WiiU technical discussion (serious discussions welcome)

Ryoku

Member
Goodness. What the fuck happened here?

Time to get back to probably the most mysterious aspect of the Wii U.

What do you guys suspect will be the power output of the GPU? Using PC graphics cards as a reference, what are your conservative, optimistic, and most likely cases?
 

Kimawolf

Member
On that note, AFAIK the only developer who has come out and said they had no issues with memory bandwidth was the developer for either Trine 2 or Nano Assault. I can't recall which one, but it was certainly one of the two.

Either way neither developer holds any real weight. They're not big name developers making complete 3D games and worlds. Rather they make small basic e-store indy titles. Trine 2 certainly isn't a graphically advanced game, nor is Nano Assault. So the fact they didn't have bandwidth issues or CPU issues means nothing.

Can you provide a source where a big name developer has said they've had no issues with the memory bandwidth of the Wii U? By big name I mean a 3rd party developer or publisher who makes complex 3D games such as COD, Mass Effect, or GTA (a Capcom, EA, Ubisoft, etc.).



As per above.

Also, given how bad the Trine 2 port on PS3 was (AA on the menus and text), I wouldn't rate them very highly or as particularly competent. They're cool and made a great game, but they're not technical wizards.

Wow... I've been reading this thread since it began, and I have to say, it has fallen far... FAR from what it was intended to be. This is the issue: how can you discount developers who've made games for the system because it doesn't meet your narrative or viewpoint? Would it help if the Aliens guys said it again perhaps or would you not trust them either? No one is arguing it's more powerful, or hell, even close in power to the other two, but to just write off developers because they don't fit your narrative is intellectually dishonest at BEST. We get it, the Wii U doesn't have "magic parts and algorithms" to make it uber powerful... but comments and crap like this have completely ruined the conversation that was going on from posters like Fourth Storm, Blu, etc.

Can we please get back to the interesting discussion on the various components and how they work?
 
Either way neither developer holds any real weight. They're not big name developers making complete 3D games and worlds. Rather they make small basic e-store indy titles. Trine 2 certainly isn't a graphically advanced game, nor is Nano Assault. So the fact they didn't have bandwidth issues or CPU issues means nothing.
Haha, I literally called this kind of post a few minutes before it happened

Also, given how bad the Trine 2 port on PS3 was (AA on the menus and text), I wouldn't rate them very highly or as particularly competent. They're cool and made a great game, but they're not technical wizards.
Going by that logic we should discount the numerous devs that have had shoddy PS3 ports, which includes a good number of the larger ones
 

tipoo

Banned
You won't have 8 cores for games; how many you will have is arguable (rumours were saying 6), but it certainly won't be 8. Also, the Wii U has 3 cores plus an audio DSP.

Which assumes both next-gen consoles will not have a DSP, which we don't know, and also that DSP processing would be any substantial burden for modern CPU cores; on PC, CPU audio is already negligible.

True, if two cores on Durango are OS reserved the difference would be less, but it's still double the cores at 1.33x the clock rate. Even if the Wii U CPU does substantially higher instructions per clock (which I don't know that it does), that's pretty hard to overcome. And for the PS4 I have not heard of the OS taking up any cores.
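As a rough sanity check on that comparison, here's a back-of-envelope cores-times-clock proxy using the rumoured figures discussed above. It deliberately ignores IPC, SIMD width, caches, and memory, so treat it as a sketch rather than a benchmark:

```python
# Crude cores x clock proxy using the rumoured figures from this thread.
# It ignores IPC, SIMD width, caches and memory, so it's only a sketch.

def throughput_proxy(cores, clock_ghz):
    return cores * clock_ghz

durango = throughput_proxy(cores=8 - 2, clock_ghz=1.6)   # 6 usable cores if 2 are OS-reserved
wiiu    = throughput_proxy(cores=3,     clock_ghz=1.24)  # Espresso, roughly 1.24GHz

print(f"Durango proxy: {durango:.2f}")          # 9.60
print(f"Wii U proxy:   {wiiu:.2f}")             # 3.72
print(f"Ratio:         {durango / wiiu:.2f}x")  # ~2.6x before any IPC differences
```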
 
Goodness. What the fuck happened here?

Time to get back to probably the most mysterious aspect of the Wii U.

What do you guys suspect will be the power output of the GPU? Using PC graphics cards as a reference, what are your conservative, optimistic, and most likely cases?
Can we please get back to the interesting discussion on the various components and how they work?
What new news exactly has come about that would provide any more insight?

We have the same comments trotted out from Shin'en for the millionth time. Yay?

This thread was bumped after the spec leaks for Durangorbis rather than anything new about the Wii U.
 

Kimawolf

Member
What new news exactly has come about that would provide any more insight?

This thread was bumped after the spec leaks for Durangorbis rather than anything new about the Wii U.

Hell, I don't know. I was enjoying reading about all the components and the theories on how it all worked together; I like that kind of stuff. And to be fair, the same kind of crap is going on in the Durango thread as well, but there it gets drowned out. As you've seen, here it's devolved into a "Wii U is shit/etc etc stop saying it's not" discussion. It completely halted the discussion and turned into what we have now. Hell, Durante's posts got swallowed up in the nonsense.
 

Ryoku

Member
What new news exactly has come about that would provide any more insight?

We have the same comments trotted out from Shin'en for the millionth time. Yay?

This thread was bumped after the spec leaks for Durangorbis rather than anything new about the Wii U.

I've actually been following this thread for quite a bit. The recent "shitstorm" began only a page ago, and seems to be detracting quite a bit from what the thread title suggests. Just my petty attempt to gain back ground, but it'll probably recover either way.
 

ikioi

Banned
Wow... I've been reading this thread since it began, and I have to say, it has fallen far... FAR from what it was intended to be. This is the issue: how can you discount developers who've made games for the system because it doesn't meet your narrative or viewpoint?

Not at all.

All I'm doing is bringing into question how much weight comments from Shin'en carry.

They're an indy developer and their titles are small budget, simplistic, and small in scale. Just because they've had no bandwidth issues doesn't mean that's indicative of what other developers may experience. Their games are nowhere near the complexity, size, or scale of a big budget game.

Would it help if the Aliens guys said it again perhaps or would you not trust them either?

In my mind it would carry more weight. Their game's size, scale, complexity, and budget are all significantly larger than Nano Assault Neo's.

If they say the Wii U's architecture is not imposing any bandwidth constraints, I'd start to change my tune.

That said, my view is not just formed on what developers say. As per my original post on the last page, I've also looked into the Wii U's architecture myself. The eDRAM on the GPU seems to be on a slower bus than that of the Xbox 360's. It's unlikely the Wii U's GPU has more than 8 ROPs. The Wii U's MEM2 pool is half the bus width of the Xbox 360's, and even with a modern, bidirectional memory controller it's still likely to be slower.


No one is arguing it's more powerful, or hell, even close in power to the other two, but to just write off developers because they don't fit your narrative is intellectually dishonest at BEST. We get it, the Wii U doesn't have "magic parts and algorithms" to make it uber powerful... but comments and crap like this have completely ruined the conversation that was going on from posters like Fourth Storm, Blu, etc.

As per the above, I've formed my opinion based on technical discussions, teardowns of the console, and what we've seen to date.

We've seen complex 3D games bog down in areas which are CPU heavy, such as how the frame rate in BLOPS drops when more players are on screen. That strongly suggests a CPU-bound limitation. We've also seen no Wii U game offer improved AA/AF vs the Xbox 360. We've also seen a reduction in volumetric, shadowing, and lighting effects in Wii U versions of titles vs the Xbox 360, which suggests ROP and bandwidth limitations.

The counter-argument to the above is that these games are just ports, and most were done on the cheap with as little financial and manpower investment as possible. They were also ported to hardware which has a significantly different architecture to the Xbox 360 and PS3. That said, I don't accept this argument, as the slowdowns really do appear to be CPU-bound, ROP-bound, and bandwidth-bound.
 
Settle down, gentlemen.

Re: Trine 2 and Nano Assault - they are pretty and the developers should be commended for their fine efforts. However, these types of level-based, stylized titles do not have the same requirements as a huge open-world adventure, or any modern FPS which does not take place in a corridor. I am no game programmer and I can see that. To use those two titles as evidence that the Wii U's memory bandwidth is sufficient or that its CPU will be up to the task of receiving AAA games for the next 5-6 years is completely ridiculous.

Nano Assault Neo was also built from the ground up on Wii U, but that doesn't mean that its performance is automatically better than if it were a port - merely that it was designed with Wii U's limitations in mind. If the team says Wii U's memory caused no problems, it merely says that they had no aspirations to design anything that required more.
 

Donnie

Member
If the current rumours are accurate, a factor of 4-6 in GPU performance (and a newer architecture as well), up to 7 in CPU performance and around 6 in external memory bandwidth. (The latter is 15 instead of 6 for Orbis, but its memory setup is not comparable)

Are you comparing to the rumoured PS4 specs here? Because no way is XBox3's GPU anywhere near 4-6x the raw performance of the Wii U's (I doubt PS4's is...). Also, I have no idea how you can come to the conclusion that 8 Jaguar cores at 1.6GHz could be 7x the Wii U's triple-core CPU (that puts each Jaguar core at nearly 3x the performance of each Wii U CPU core, not a chance). Also worth considering the rumours that 2 of those Jaguar cores will be locked for the OS/apps, and that the Wii U has a DSP and ARM CPU to lighten its load.
 

JordanN

Banned
On that note, AFAIK the only developer who has come out and said they had no issues with memory bandwidth was the developer for either Trine 2 or Nano Assault. I can't recall which one, but it was certainly one of the two.

Either way neither developer holds any real weight. They're not big name developers making complete 3D games and worlds. Rather they make small basic e-store indy titles. Trine 2 certainly isn't a graphically advanced game, nor is Nano Assault. So the fact they didn't have bandwidth issues or CPU issues means nothing.

Can you provide a source where a big name developer has said they've had no issues with the memory bandwidth of the Wii U? By big name I mean a 3rd party developer or publisher who makes complex 3D games such as COD, Mass Effect, or GTA (a Capcom, EA, Ubisoft, etc.).
In the same interview, it's assumed they had no problems with the memory at all (including bandwidth).

And about the RAM, are other parameters than latency, such as bandwidth, favorable compared to current platforms?

Shin'en: I’m not able to be precise on that but for us as a developer we see everything works perfectly together.
 
For the graphical performance of a comparable card to the Wii U, I would say below a 5670 on GDDR5. A 440GFLOP estimate for the GPU is what I'd go with. This will probably translate to 2x the 360's pure GPU performance. The eDRAM in each system is likely very different; it would be much too hard to say what the effect is without much more data.

DX11 brought tessellation and many AA techniques. MLAA would be the big thing AMD was pushing at the time of DX11. The Wii U will lack tessellation at least. I think there are also some texture features. I don't really know the entire DX11 API, but there is a difference between hardware that can run it and hardware that can't. OGL will not magically allow you to do things not inside the hardware.

The entire CPU is about the size of a single 45nm Atom core. There's not much to say about it except that it's tiny. The 750 core hasn't been in development at IBM for years, except when Nintendo wanted a new console. I'm sure Nintendo didn't pay IBM as much money as Intel sinks into CPU development for us to compare it to a Core 2.
 
For the graphical performance of a comparable card to the Wii U, I would say below a 5670 on GDDR5. A 440GFLOP estimate for the GPU is what I'd go with. This will probably translate to 2x the 360's pure GPU performance. The eDRAM in each system is likely very different; it would be much too hard to say what the effect is without much more data.

Where are you even getting this from? Most estimates peg it at around 600 GFLOPS. It seems like you're ignoring the other GPUs that it's said to be close to and the numerous other estimates.
 

JordanN

Banned
What's "pushing high horsepower engines" even supposed to mean?
IW Engine has high horsepower. See, look at the horsies!
[image: 9KAJtoi.jpg]
 

Donnie

Member
For the graphical performance of a comparable card to the Wii U, I would say below a 5670 on GDDR5. A 440GFLOP estimate for the GPU is what I'd go with. This will probably translate to 2x the 360's pure GPU performance. The eDRAM in each system is likely very different; it would be much too hard to say what the effect is without much more data.

DX11 brought tessellation and many AA techniques. MLAA would be the big thing AMD was pushing at the time of DX11. The Wii U will lack tessellation at least. I think there are also some texture features. I don't really know the entire DX11 API, but there is a difference between hardware that can run it and hardware that can't. OGL will not magically allow you to do things not inside the hardware.

The entire CPU is about the size of a single 45nm Atom core. There's not much to say about it except that it's tiny. The 750 core hasn't been in development at IBM for years, except when Nintendo wanted a new console. I'm sure Nintendo didn't pay IBM as much money as Intel sinks into CPU development for us to compare it to a Core 2.

I agree with a lot of that, though I don't think we can know what's been added to the Wii U's GPU at this point. Improved tessellation? Maybe, maybe not. Also, as for Espresso being tiny, isn't a Jaguar core about the same size (much smaller on its 28nm process)?
 
Where are you even getting this from? Most estimates peg it at around 600 GFLOPS. It seems like you're ignoring the other GPUs that it's said to be close to and the numerous other estimates.
From the actual chip. Just because some person heard someone say something about something else doesn't make it likely. There is not enough space on the GPU for 600 GFLOPS.
 

ikioi

Banned
In the same interview, it's assumed they had no problems with the memory at all (including bandwidth).

As per my previous post, they're a small indy developer who made a rather simplistic game. They're also heavily invested in Nintendo, given they only make titles for Nintendo's platforms.

It means nothing to me that they had no bandwidth issues with Nano Assault Neo. I own the game and it's frankly a very simplistic title. Its graphics aren't great, it's got limited maps and limited enemies; it's not by any stretch a resource-intensive game. It's by no means a bad game, it's fun, but technically it's nothing brilliant or even special.

Get back to me when a big-name developer with a big-budget, complex 3D title says the Wii U's bandwidth is sufficient and they're happy with it.

Every bit of evidence suggests the Wii U's bandwidth is poor, very poor: from the slow eDRAM to the 64-bit MEM2 pool. Then there's the CPU, which has an incredibly low TDP and is physically the size of a single Intel Atom core. The GPU also appears to only have 8 ROPs, as evidenced by the console's low bandwidth and how it bogs down in games when specific ROP-intensive features are used.

Your entire argument seems to center around what a single indy, Nintendo-aligned developer states. Even then the context is questionable, as he didn't talk specifics.

I'll go on record saying the Wii U is likely less powerful than the Xbox 360. That's my view and I doubt that's going to change. Nintendo will no doubt deliver some super awesome games on the Wii U though; they have a reputation for being able to achieve brilliant results on their hardware. But I can't see 3rd parties being able to offer multiplatform titles on this system at a visual fidelity higher than the Xbox 360's. The system is too gimped, and the effort required to make a superior port on the Wii U would be too costly and the reward too minimal.

Yes.
Anyone who says otherwise is uninformed or trying to stir up shit.

You base this on what?
 

Donnie

Member
Where are you even getting this from? Most estimates peg it at around 600 GFLOPS. It seems like you're ignoring the other GPUs that it's said to be close to and the numerous other estimates.

I think he's looking at the size of the HD5670 (104mm2) and the transistor count (627 million) and assuming that the Wii U's GPU will be of a similar size and transistor count once you remove the eDRAM and any other extras on there (a DSP, ARM CPU perhaps). He's then dropping the clock to 550Mhz, which works out at 440Gflops. A bit conservative maybe, but overall pretty reasonable.

Personally I think 440Gflops or 528Gflops are the two most likely numbers (unless we take Matt's comment about the shader config being very unusual, in which case who knows what the exact number could be).
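For reference, all of the GFLOPS numbers being thrown around come from the same simple formula: shaders x 2 ops per clock x clock speed. A quick sketch with a few of the candidate shader counts, none of which are confirmed:

```python
# Back-of-envelope GFLOPS for a VLIW5-era AMD GPU: shaders x 2 FLOPs (MADD) x clock.
# The shader counts are just candidates argued about in this thread, not confirmed specs.

def gflops(shader_count, clock_ghz):
    return shader_count * 2 * clock_ghz

for shaders in (320, 400, 480):
    print(f"{shaders} SPUs @ 550MHz -> {gflops(shaders, 0.55):.0f} GFLOPS")
# 320 -> 352, 400 -> 440, 480 -> 528
```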
 

ozfunghi

Member
The knee jerk defenders are at least as bad. (Yeah I'm still up)

Not sure if that is supposed to (also) be aimed at me, but you didn't have any problems with a poster stating as fact that the Wii U was in between the Xbox and Xbox 360 performance-wise a couple of pages back, and didn't understand why he was being criticized, yet asking for further explanation as to where exactly the touted performance gain in Durango comes from warrants being trolled?

So while I don't want to question your technical knowledge, I strongly disagree here. The kneejerk defenders are nowhere near as bad.


Where are you even getting this from? Most estimates peg it at around 600 GFLOPS. It seems like you're ignoring the other GPUs that it's said to be close to and the numerous other estimates.

Honestly, after reading an entire year's speculation, with what we know about clock speed, die size, memory size, etc., I think you'll need to expect between 352 and 528 Gflops. My guess based on a hunch has always been 460 Gflops, but that was when the clock speeds of the CPU, GPU, DSP, etc. were all assumed to run at a certain multiplier of each other. The GPU already runs faster than I guessed (550 instead of 480 Mhz).
 

ikioi

Banned
Yes.
Anyone who says otherwise is uninformed or trying to stir up shit.


Like what?

The Mass Effect 3 port that runs worse than the Xbox 360 version.

Or the Assassin's Creed version?

Or the poor performance and missing effects in Darksiders II?

Or the lag and fps drops in BLOPS when more than a couple of players are on screen?

Or that NSMB is only 720p?

What reality are you in? If you've somehow built a bridge between our reality and a reality where Nintendo released a more powerful Wii U, help me cross over!
 

Margalis

Banned
Please provide a link when you make such accusations. I certainly don't remember anyone calling eDRAM "A Nintard wishful-thinking setup".

Although nobody used those exact words the idea that the eDRAM could alleviate issues with the slow RAM was brought up and dismissed with extreme prejudice many times.

Then the Durango specs leak: it has the same type of slow RAM and the same amount of eDRAM, even though it may be targeting higher resolutions and has more horsepower (meaning in essence it has proportionally less eDRAM than the Wii U), and less L2 cache per core (not proportionally but in absolute terms), and somehow that setup *does* alleviate issues with slow RAM via... magic?

The lack of consistency is obvious and seems to illustrate that, rather than looking at specs and drawing conclusions, some people are more interested in drawing conclusions first and then justifying them with creaky, incoherent arguments.

These certainly don't appear to be serious arguments made in good faith when one second the story is that eDRAM can't make up for slow main RAM, then two seconds later that story reverses, but only for the system that person has been excited for and not for the system they've been down on since its announcement.

Edit: In this thread USC-Fan is a perfect example of that. Nintendo added eDRAM as a cost-saving measure because Nintendo are penny-pinchers, MS added eDRAM because that's super efficient and MS engineers are tops! It seems like the architecture decisions make less difference than the logo on the box.
 

JordanN

Banned
As per my previous post, they're a small indy developer who made a rather simplistic game. They're also heavily invested in Nintendo, given they only make titles for Nintendo's platforms.

I don't think Nano Assault Neo qualifies as simplistic. The setting itself maybe, but the actual game can get chaotic. It was Shin'en who went on record saying thousands of animated objects can appear on screen. In addition, the game can also be displayed on the gamepad and TV simultaneously.
 

Donnie

Member
Which assumes both next-gen consoles will not have a DSP, which we don't know, and also that DSP processing would be any substantial burden for modern CPU cores; on PC, CPU audio is already negligible.

True, if two cores on Durango are OS reserved the difference would be less, but it's still double the cores at 1.33x the clock rate. Even if the Wii U CPU does substantially higher instructions per clock (which I don't know that it does), that's pretty hard to overcome. And for the PS4 I have not heard of the OS taking up any cores.

Well, we have the full hardware diagram for XBox 3, which describes some pretty small details, yet there's no mention of a DSP. Also, sound may be described as trivial for modern PC CPUs such as the Phenom II and Intel Core i series, but Jaguar isn't in that league. It wouldn't surprise me if audio could take up as much as a whole Jaguar core, or close enough.

XBox3's Jaguar CPU is clocked 29% faster, but it's got a much longer pipeline, 14 stages compared to Espresso's 4. I'm not saying that gives the Wii U's CPU the advantage core for core or even makes it 100% as fast (we really can't compare exactly), but it should help to counteract the lower clock speed.
 
While ikioi's post had a negative slant, I don't see how it wasn't technical discussion. Unless the thread is only intended for effusive praise of Nintendo's design choices.
It's political punditry thinly disguised as technical discussion. What information is contained in that post? "Game X has less AA, therefore the Wii U is inferior." So what? If I do a lazy port of an Android game to Windows, does that imply my PC is inferior to my phone?

Technical discussion: data, e.g. die sizes, FLOPS; opinions from developers on bottlenecks (or lack thereof); etc.

Not technical discussion: screenshot-informed conjecture from dudes on the internet.
 
For the graphical performance of a comparable card to the Wii U, I would say below a 5670 on GDDR5. A 440GFLOP estimate for the GPU is what I'd go with. This will probably translate to 2x the 360's pure GPU performance. The eDRAM in each system is likely very different; it would be much too hard to say what the effect is without much more data.

DX11 brought tessellation and many AA techniques. MLAA would be the big thing AMD was pushing at the time of DX11. The Wii U will lack tessellation at least. I think there are also some texture features. I don't really know the entire DX11 API, but there is a difference between hardware that can run it and hardware that can't. OGL will not magically allow you to do things not inside the hardware.

The entire CPU is about the size of a single 45nm Atom core. There's not much to say about it except that it's tiny. The 750 core hasn't been in development at IBM for years, except when Nintendo wanted a new console. I'm sure Nintendo didn't pay IBM as much money as Intel sinks into CPU development for us to compare it to a Core 2.
Several things to clarify:

- The Wii U doesn't use DirectX. DX10.1-equivalent features are the baseline, but there are modifications and stuff in GX2 that we don't know any specifications about.

- From a reliable source (and from the leaked document), the Wii U does have a tessellation unit. It is probably based on the one that was in the R700 series, which may have never been used outside of a demo. It is unknown how effective it is compared to the older units that were in older GPUs, like the one in the 360.
 
I think he's looking at the size of the HD5670 (104mm2) and the transistor count (627 million) and assuming that the Wii U's GPU will be of a similar size and transistor count once you remove the eDRAM and any other extras on there (a DSP, ARM CPU perhaps). He's then dropping the clock to 550Mhz, which works out at 440Gflops. A bit conservative maybe, but overall pretty reasonable.

Personally I think 440Gflops or 528Gflops are the two most likely numbers (unless we take Matt's comment about the shader config being very unusual, in which case who knows what the exact number could be).

Was it Matt who stated the config of SPUs:TMUs:ROPs was unusual? I read that over on Beyond3D as well. Thinking about it more, what if the SPUs and TMUs were added in some off ratio for dedicated compute purposes? Follow me here... For the lower-end cards in the R700 series, every 40 SPUs is allotted 4 TMUs. Just for example, if they did go with 400 SPUs, what if they simply left off the TMUs for two of the groups (so 32 TMUs instead of 40)? Boom! There's your GPGPU!

(It's a dark thought, but what if?)
 

NBtoaster

Member
Although nobody used those exact words the idea that the eDRAM could alleviate issues with the slow RAM was brought up and dismissed with extreme prejudice many times.

Then the Durango specs leak: it has the same type of slow RAM and the same amount of eDRAM, even though it may be targeting higher resolutions and has more horsepower (meaning in essence it has proportionally less eDRAM than the Wii U), and less L2 cache per core (not proportionally but in absolute terms), and somehow that setup *does* alleviate issues with slow RAM via... magic?

They're both DDR3, but that doesn't limit them to the same bandwidth and speed. The rumor says it's 68GB/s, 5x the Wii U bandwidth.
 

USC-fan

Banned
I think he's looking at the size of the HD5670 (104mm2) and the transistor count (627 million) and assuming that the Wii U's GPU will be of a similar size and transistor count once you remove the eDRAM and any other extras on there (a DSP, ARM CPU perhaps). He's then dropping the clock to 550Mhz, which works out at 440Gflops. A bit conservative maybe, but overall pretty reasonable.

Personally I think 440Gflops or 528Gflops are the two most likely numbers (unless we take Matt's comment about the shader config being very unusual, in which case who knows what the exact number could be).

Isn't this 5670 a 61W TDP card? The whole wuu uses half of that running games. Look at the 5550: it's 39W TDP and even that uses more power than the entire wuu.

The whole wuu console uses 33 watts running games...

Wuu would be lucky to hit over 350 Gflops. The 40nm 5550 uses 39 watts and is at 352 Gflops.
 

ozfunghi

Member
Like what?
The Mass Effect 3 port that runs worse than the Xbox 360 version.

Or the Assassin's Creed version?

etc

Oh please, this has been covered so much it isn't funny. The Xbox 360 or PS3 was always the lead platform for those games. The Wii U versions were ports, and on more than a few occasions we know these ports were outsourced, handled by a small team, both time- and budget-constrained, who didn't know the system like their colleagues who had 7 years to build experience on the HD twins. Furthermore, the Wii U has a totally different architecture from either the Xbox 360 or PS3, so in order to get the job done within said constraints, corners had to be cut. And yes, under those circumstances, the Wii U doesn't have enough grunt to outshine the twins. That doesn't mean that when a game is made from the ground up, with enough resources, by a developer that has gotten to know the hardware previously, that will still be the case.


They're both DDR3, but that doesn't limit them to the same bandwidth and speed. The rumor says it's 68GB/s, 5x the Wii U bandwidth.

Right, and on the other hand, we have an interview with a developer who basically says he cannot believe the numbers that are being brought up for the Wii U's memory bandwidth. So either the "confirmed" RAM specs aren't all that confirmed, or something else is up.
 

Margalis

Banned
How can you discount developers who've made games for the system because it doesn't meet your narrative or viewpoint?

The rhetorical trick that certain people like to use is that whenever anyone says something negative about the WiiU those people are to be trusted especially if they aren't making WiiU games and have no horse in that race - even if it means they have no real technical knowledge and are not speaking from experience.

Meanwhile if you have actual technical knowledge of the WiiU and a new dev kit you are making a WiiU game, and therefore it's in your best interest to pimp the game and the system, and therefore nothing positive you say has merit.

It's a pretty clever rhetorical strategy that filters out anything positive by design.
 

ikioi

Banned
Technical discussion: data, e.g. die sizes, FLOPS; opinions from developers on bottlenecks (or lack thereof); etc.

Not technical discussion: screenshot-informed conjecture from dudes on the internet.

You mean like how I talked about the Wii U's MEM2 pool being on a 64-bit bus? DDR3-1600 on a 64-bit bus: 4 chips of 512 megabytes capacity, each on a 16-bit bus. 16-bit x 4 = 64-bit. At 1600MT/s, a 64-bit bus works out to roughly 12.8GB/s of bandwidth. This is in comparison to the next-gen consoles, which appear to be using 256-bit buses for their main memory pools. The Xbox 360 and PS3 also used a 128-bit bus for their GDDR3, which still provides more raw bandwidth than the Wii U's. Even with a modern memory controller, there's no way the Wii U's RAM is on par, even in the real world, with the Xbox 360's memory.

Or the likelihood that the GPU only has 8 ROPs due to the low memory bandwidth of the Wii U. ROPs are bandwidth dependent; there's no point adding more ROPs unless you can feed them data fast enough.

Which I expanded on by using the Xbox 360 as an example. With the Xbox 360, the ROPs were integrated into the eDRAM. Due to this configuration the ROPs could be fed data at around 256 gigabytes per second. The Wii U's eDRAM implementation does not seem to be similar to this, with its bus being considerably slower.

Or the fact the CPU is the size of a single Intel Atom core and has an incredibly low TDP. It's also based on the decade-old IBM PPC 750 architecture. Its performance is going to be anything but stellar.
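For what it's worth, the peak-bandwidth arithmetic behind those figures is straightforward: transfer rate times bus width. A sketch, with the Durango line assuming the rumoured 256-bit DDR3-2133 setup behind the leaked 68GB/s number:

```python
# Peak main-memory bandwidth = effective transfer rate (MT/s) x bus width (bytes).
# Wii U and 360 figures follow the teardown numbers discussed above; the Durango
# line is an assumption based on the rumoured 256-bit DDR3-2133 configuration.

def bandwidth_gbs(transfers_mt_s, bus_width_bits):
    return transfers_mt_s * (bus_width_bits / 8) / 1000  # GB/s

print(bandwidth_gbs(1600, 64))    # Wii U MEM2: DDR3-1600, 64-bit       -> 12.8 GB/s
print(bandwidth_gbs(1400, 128))   # Xbox 360: GDDR3-1400, 128-bit       -> 22.4 GB/s
print(bandwidth_gbs(2133, 256))   # Durango rumour: DDR3-2133, 256-bit  -> ~68.3 GB/s
```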

The rhetorical trick that certain people like to use is that whenever anyone says something negative about the WiiU those people are to be trusted especially if they aren't making WiiU games and have no horse in that race - even if it means they have no real technical knowledge and are not speaking from experience.

There's no smoke without fire, and in the case of the Wii U's CPU, the house is well and truly alight. We've seen DICE slam it, Crytek slam it, unnamed sources months ago slam it, and even developers publicly comment on how it was an obstacle they had to work around.

There's also no denying the CPU is based on the decade-plus-old IBM PPC 750 architecture, has the transistor count of a single Intel Atom core, and has an incredibly low TDP.


Meanwhile if you have actual technical knowledge of the WiiU and a new dev kit you are making a WiiU game, and therefore it's in your best interest to pimp the game and the system, and therefore nothing positive you say has merit.

It's a pretty clever rhetorical strategy that filters out anything positive by design.

Technical knowledge from a small-time indy developer who has made one simple, small game for the e-store. They also just so happen to only make games for Nintendo hardware.

Yeah, totally indicative of the Wii U's performance and an unbiased source.

Also, you're just as full of rhetoric as anyone else here. You come into this thread criticising people for their arguments and views, yet offer none of your own.
 
Isn't this 5670 a 61W TDP card? The whole wuu uses half of that running games. Look at the 5550: it's 39W TDP and even that uses more power than the entire wuu.

The whole wuu console uses 33 watts running games...

Wuu would be lucky to hit over 350 Gflops. The 40nm 5550 uses 39 watts and is at 352 Gflops.

They would likely be using some newer power gating, and they clocked the chip much lower than the retail 5670. It's more like comparing to the mobile 5750.
 

ozfunghi

Member
No offense, but Shin'en aren't known for pushing modern, high-horsepower engines. They haven't run into any problems because they're adjusted to working in a much smaller space.

What's "pushing high horsepower engines" even supposed to mean?

Are we supposed to discredit the opinion of a developer because he is used to working on less powerful hardware? So while he can squeeze more out of this hardware, it somehow "doesn't count" because they are used to working on underpowered hardware? As if it's basically cheating, and therefore the results he gets, and by extension his opinion, are invalid?
 
You mean like how I talked about the Wii U's MEM2 pool being on a 64-bit bus? DDR3-1600 on a 64-bit bus: 4 chips of 512 megabytes capacity, each on a 16-bit bus. 16-bit x 4 = 64-bit. At 1600MT/s, a 64-bit bus works out to roughly 12.8GB/s of bandwidth. This is in comparison to the next-gen consoles, which appear to be using 256-bit buses for their main memory pools. The Xbox 360 and PS3 also used a 128-bit bus for their GDDR3, which still provides more raw bandwidth than the Wii U's. Even with a modern memory controller, there's no way the Wii U's RAM is on par, even in the real world, with the Xbox 360's memory.

Or the likelihood that the GPU only has 8 ROPs due to the low memory bandwidth of the Wii U. ROPs are bandwidth dependent; there's no point adding more ROPs unless you can feed them data fast enough.

Which I expanded on by using the Xbox 360 as an example. With the Xbox 360, the ROPs were integrated into the eDRAM. Due to this configuration the ROPs could be fed data at around 256 gigabytes per second. The Wii U's eDRAM implementation does not seem to be similar to this, with its bus being considerably slower.

Or the fact the CPU is the size of a single Intel Atom core and has an incredibly low TDP. It's also based on the decade-old IBM PPC 750 architecture. Its performance is going to be anything but stellar.

Stop pretending to know what you are talking about... Just because you take what other people say and reorganize it before regurgitating it doesn't mean you understand it.


AFAIK, the 360's eDRAM was off-chip... connected by a very bandwidth-limiting bus... somewhere in the xxGB/s range. I'm not completely knowledgeable on this. But then again, I'm not going to act like I am either.
 

Donnie

Member
Isn't this 5670 a 61W TDP card? The whole wuu uses half of that running games. Look at the 5550: it's 39W TDP and even that uses more power than the entire wuu.

The whole wuu console uses 33 watts running games...

Wuu would be lucky to hit over 350 Gflops. The 40nm 5550 uses 39 watts and is at 352 Gflops.

Don't know why you're always so focused on direct TDP comparisons. For a start, that GPU (the 61w card) is using 2GB of GDDR5; that stuff can easily use around 20 watts by itself. Second, the GPU is clocked at 775Mhz; downclock it to 550Mhz (29%) and you'll lower the TDP by a larger percentage than you've lowered the clock speed. Finally, this is a custom part, which could easily have some legacy PC features removed in order to save power, and also improved power gating.

The idea that a 156mm2 GPU with around 1 billion transistors will be lucky to hit 350Gflops at 550Mhz is quite mad.
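The reason a downclock can cut power by more than the clock reduction is that dynamic CMOS power scales roughly with frequency times voltage squared, and lower clocks usually allow lower voltage. A sketch with purely illustrative voltages (the real figures for this chip aren't known):

```python
# Dynamic CMOS power scales roughly as P ~ C * V^2 * f.
# The voltages below are illustrative assumptions, not known figures for the Wii U GPU.

def relative_dynamic_power(freq_ratio, voltage_ratio):
    return freq_ratio * voltage_ratio ** 2

# 775MHz -> 550MHz is a 29% clock cut; assume the voltage can drop from 1.10V to 0.95V:
print(relative_dynamic_power(550 / 775, 0.95 / 1.10))  # ~0.53, i.e. nearly half the dynamic power
```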
 

Margalis

Banned
They're both DDR3, but that doesn't limit them to the same bandwidth and speed. The rumor says it's 68GB/s, 5x the Wii U bandwidth.

If the system has significantly more horsepower, is targeting higher resolutions and frame rates and Unreal Engine 4 features, and has more RAM overall, it had better have more bandwidth.

The WiiU has 1 gig available for games and has trouble using that effectively due to the memory bandwidth. (Supposedly) Durango has 5x+ available for games at 5x the bandwidth. Hmm.

ozfunghi said:
Are we supposed to discredit the opinion of a developer because he is used to working on less powerful hardware? So while he can squeeze more out of this hardware, it somehow "doesn't count" because they are used to working on underpowered hardware? As if it's basically cheating, and therefore the results he gets, and by extension his opinion, are invalid?

You're supposed to discount anyone you disagree with via an increasingly arbitrary set of rules transparently set up to arrive at the conclusion you want. When Trine 2 was on PC/360, people were jizzing about how awesome it was visually and posting screenshots to illustrate its incredible technical performance. Dat lighting! Now that the Wii U version has content that wouldn't run on 360, it's a simple game made by dummies. How convenient.
 

USC-fan

Banned
They would likely be using some newer power gating, and they clocked the chip much lower than the retail 5670. It's more like comparing to the mobile 5750.
It's always a binned part...

Don't know why you're always so focused on direct TDP comparisons. For a start, that GPU (the 61w card) is using 2GB of GDDR5; that stuff easily uses 20 or so watts by itself. Second, the GPU is clocked at 775Mhz; downclock it to 550Mhz (29%) and you'll lower the TDP by a larger percentage than you've lowered the clock speed. Finally, this is a custom part, which could easily have some legacy PC features removed in order to save power.

The idea that a 156mm2 GPU with around 1 billion transistors will be lucky to hit 350Gflops at 550Mhz is quite mad.
You have 33 watts to power the whole system; do the math... Funny, you have a card that matches the specs but you overlook it because it wrecks your math: the 5550...

156mm2 = ~35mm2 (eDRAM) + xx mm2 (I/O) + 104mm2 (GPU core)

The 5550: 352 Gflops, 39W TDP = the wuu GPU! This is best case... Looks like it's 550Mhz too... hmmmmm
 
Here's a sweet post from earlier in the thread, to help us get back on track:
http://www.neogaf.com/forum/showthread.php?p=44897979#post44897979
I'd agree with this request. This thread is specifically for in-depth technical discussion, and not really the place for discussing how nice the Zelda demo does or doesn't look. Off-hand comments are fine, but big GIFs like those are very attention-grabbing, so inevitably draw the discussion towards them (particularly when quoted again).

Plus, you're drawing attention away from the boring spreadsheet I was about to post :p

Okay, so I realised that, even though the R700 architecture doesn't have a strict ratio between the number of SPUs and texture units, there are certain combinations which are possible, and certain ones which aren't. So, I put together a handy table on what the possible configurations are, and also what Gflops and texture fillrate values you'd get for each one:

[image: wiiugputable.png - table of viable R700 SPU/TMU configurations]


On the left are the number of SPUs and on the top are the number of texture units (both using "Wikipedia numbering"). The table then shows you whether a configuration is possible or not, with a green Y meaning it is a viable configuration, and a red N meaning (surprise surprise) it isn't. Then, when you've found a configuration that's viable, you can look to the column of numbers on the right to tell you how many Gflops you've got, and then you can look to the row of numbers on the bottom to tell you your texture fillrate. The number of ROPs in R700 is unconstrained by the other specifications, it just has to be a multiple of 4 and you're good (and you can multiply the number of ROPs by 0.55 to get your pixel fillrate in GP/s).
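If you'd rather compute a cell of that table than read it off, the numbers fall out of the clock and the unit counts directly. A sketch for one example config (not a confirmed Wii U layout):

```python
# Throughput for one candidate R700-style config at 550MHz.
# One MADD (2 FLOPs) per SPU, one bilinear texel per TMU, one pixel per ROP, per clock.
# The 320/16/8 config is only an example, not a confirmed Wii U layout.

CLOCK_GHZ = 0.55

def r700_throughput(spus, tmus, rops, clock_ghz=CLOCK_GHZ):
    return {
        "gflops":      spus * 2 * clock_ghz,  # e.g. 320 SPUs -> 352 GFLOPS
        "texels_gt_s": tmus * clock_ghz,      # e.g. 16 TMUs  -> 8.8 GT/s
        "pixels_gp_s": rops * clock_ghz,      # e.g. 8 ROPs   -> 4.4 GP/s
    }

print(r700_throughput(spus=320, tmus=16, rops=8))
```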

Now, back to this...
You mean like how I talked about the Wii U's MEM2 pool being on a 64-bit bus? DDR3-1600 on a 64-bit bus: 4 chips of 512 megabytes capacity, each on a 16-bit bus. 16-bit x 4 = 64-bit. At 1600MT/s, a 64-bit bus works out to roughly 12.8GB/s of bandwidth. This is in comparison to the next-gen consoles, which appear to be using 256-bit buses for their main memory pools. The Xbox 360 and PS3 also used a 128-bit bus for their GDDR3, which still provides more raw bandwidth than the Wii U's. Even with a modern memory controller, there's no way the Wii U's RAM is on par, even in the real world, with the Xbox 360's memory.

Or the likelihood that the GPU only has 8 ROPs due to the low memory bandwidth of the Wii U. ROPs are bandwidth dependent; there's no point adding more ROPs unless you can feed them data fast enough.

Which I expanded on by using the Xbox 360 as an example. With the Xbox 360, the ROPs were integrated into the eDRAM. Due to this configuration the ROPs could be fed data at around 256 gigabytes per second. The Wii U's eDRAM implementation does not seem to be similar to this, with its bus being considerably slower.

Or the fact the CPU is the size of a single Intel Atom core and has an incredibly low TDP. It's also based on the decade-old IBM PPC 750 architecture. Its performance is going to be anything but stellar.
Well, you're starting with facts, I'll give you that.

MEM2 bus size is interesting, but even if you're 100% correct... it's only a problem if it's a bottleneck. Can you prove it's a bottleneck? No, at this point it's speculation. You are free to conjecture that this is so, but when developers suggest that at 720p they are not running into said bottlenecks, that conjecture rings rather hollow.

It does seem fair to say that, based on the bus speeds and what we know in general, the Wii U is not a "1080p console". That doesn't mean that some devs (1st party especially) won't coax great performance out of it at 1080p, but it seems reasonable to suggest that that will be the exception rather than the rule.

That the CPU is slow and underpowered is indeed reality at this point -- but as has been discussed, going with a lower-powered, out-of-order CPU and a beefier GPU for general-purpose computation was a deliberate choice. The CPU being slow only matters in that it potentially makes current-gen ports more difficult. As far as "next gen" goes, the PS4 and 720 were always going to outclass the Wii U by a significant margin in the raw power department. They'll also dim the lights in your house when you turn them on.
 

ikioi

Banned
Stop pretending to know what you are talking about... Just because you take what other people say and reorganize it before regurgitating it doesn't mean you understand it.

That's the new argument, is it? That I don't understand the specs.

Seriously, can mods come in and just ban wankers like this? It's one thing to have a debate, but putting up with idiots that come in here and spout this kind of crap really gets to me. You're just a petty fanboi who doesn't want to accept the reality that this system at best is slightly better than a 7-year-old console.


Also, it's not that hard to understand that the Wii U's MEM2 pool is slow. DDR3 is quad clocked and double data rate, and then defined by its bus width. DDR3-1600 has a base clock of 200Mhz: 200Mhz times 4 for the quad clock equates to 800Mhz, and times that by two because DDR3 is double data rate, that's 1600MT/s. Then multiply that by the bus width, which is 16-bit per module with 4 modules, so 64-bit. 1600MT/s times 64 bits works out to roughly 12.8GB/s.


AFAIK, the 360's eDRAM was off-chip... connected by a very bandwidth-limiting bus

The ROPs were integrated into the eDRAM; they were removed from the GPU.

I'm not completely knowledgeable on this. But then again, I'm not going to act like I am either.

So how do you know I don't understand what I'm talking about?
 
I'm gonna throw a number out there: 70.4 GB/s. This, I believe, is the bandwidth from the Wii U GPU to its eDRAM.

Earlier in the thread, UX8GD eDRAM from NEC/Renesas was brought up as a leading (almost surefire) candidate for the Wii U GPU's on-chip memory. It comes in several configurations, but one now strikes me as most likely: 4 x 8MB macros, each with a 256-bit bus. What makes this interesting is that very early on there was talk of the first Wii U dev kits containing an underclocked RV770LE. This news got our hopes up, but 640 shaders and the like are now out of the question and indeed never seemed to be in the cards. Why wouldn't they have just used a weaker, smaller, and cooler PC part then (I'm assuming the report was true)? Well, the bandwidth of that card to its onboard GDDR3 just happens to be 57.6 GB/s. What's the bandwidth of eDRAM on a 1024-bit bus at 450 Mhz (the reported clock of Wii U dev kits up until late 2011)? 57.6 GB/s. I know it's been hypothesized before, but it seems increasingly likely those first dev kits utilized that particular Radeon to simulate the bandwidth to Wii U's MEM1. Since they upped the speed to 550 Mhz, it should now clock in at 70.4 GB/s, for better or worse.
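The arithmetic behind those two figures is just bus width times clock; a sketch assuming the 4 x 256-bit (1024-bit total) UX8GD macro layout guessed at above, with one transfer per cycle:

```python
# eDRAM bandwidth = bus width (bits) x clock, assuming one transfer per cycle.
# The 1024-bit bus (4 x 256-bit UX8GD macros) is this post's guess, not a confirmed spec.

def edram_bandwidth_gbs(bus_bits, clock_mhz):
    return bus_bits / 8 * clock_mhz / 1000  # GB/s

print(edram_bandwidth_gbs(1024, 450))  # 57.6 GB/s at the early dev-kit clock
print(edram_bandwidth_gbs(1024, 550))  # 70.4 GB/s at the retail 550MHz clock
```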
 