
WiiU technical discussion (serious discussions welcome)

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
Some of you guys really need to get the stick out of your asses when it comes to BG.

I've already said if you guys want to bitch at someone for not lowering the bar it should be me.

I had access to the same info as BG, and had a slightly more pessimistic outlook, but didn't post with the same degree of certainty.

From having that info back then, would you have predicted the console cost ~$300 to manufacture?
 

Goldmund

Member
Gentlemen, may I ask you to refrain from posting animated gifs and/or quoting them, unless you have a particular technical point in mind? I think we are all adults here, who can understand and respect thread topics. Thanks.
Give me a second, I'm busy counting the polygons.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
I tend to ignore that portion of the argument entirely.

Price has never been dependent on costs of manufacturing. It's always been what price people are willing to pay.

Well... what I'm getting at is: if this chipset costs, say, ~$200 to manufacture, would that not give us a half-step (~3 years of advancement) over the Xbox 360?

Not sure what the Gamepad, casing and disc drive cost, but I'd guesstimate they're between $100 and $150.
 

Thraktor

Member
Actually, if you could edit those GIFs out of your post that would probably be helpful to keep this thread focused on what it has been and not devolving into screenshot warzzzz.
Pretty please?
HjWvo.gif

I'd agree with this request. This thread is specifically for in-depth technical discussion, and not really the place for discussing how nice the Zelda demo does or doesn't look. Off-hand comments are fine, but big GIFs like those are very attention-grabbing, so inevitably draw the discussion towards them (particularly when quoted again).

Plus, you're drawing attention away from the boring spreadsheet I was about to post :p

Okay, so I realised that, even though the R700 architecture doesn't have a strict ratio between the number of SPUs and texture units, there are certain combinations which are possible, and certain ones which aren't. So, I put together a handy table on what the possible configurations are, and also what Gflops and texture fillrate values you'd get for each one:

wiiugputable.png


On the left are the number of SPUs and on the top are the number of texture units (both using "Wikipedia numbering"). The table then shows you whether a configuration is possible or not, with a green Y meaning it is a viable configuration, and a red N meaning (surprise surprise) it isn't. Then, when you've found a configuration that's viable, you can look to the column of numbers on the right to tell you how many Gflops you've got, and then you can look to the row of numbers on the bottom to tell you your texture fillrate. The number of ROPs in R700 is unconstrained by the other specifications, it just has to be a multiple of 4 and you're good (and you can multiply the number of ROPs by 0.55 to get your pixel fillrate in GP/s).
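
If anyone would rather plug numbers in than squint at the table, here's a quick Python sketch of the same arithmetic. It assumes a 550MHz GPU clock (which is where the 0.55 multiplier above comes from), 2 flops per SPU per cycle, 1 texel per texture unit per cycle and 1 pixel per ROP per cycle, all in "Wikipedia numbering", so treat it as a back-of-the-envelope tool rather than gospel:

Code:
# Back-of-the-envelope calculator for R700-style configurations.
# Assumes a 550MHz clock, 2 flops per SPU per cycle, 1 texel per
# texture unit per cycle and 1 pixel per ROP per cycle.
GPU_CLOCK_GHZ = 0.55

def config_numbers(spus, texture_units, rops):
    gflops = spus * 2 * GPU_CLOCK_GHZ           # shader throughput, Gflops
    texel_rate = texture_units * GPU_CLOCK_GHZ  # texture fillrate, GT/s
    pixel_rate = rops * GPU_CLOCK_GHZ           # pixel fillrate, GP/s
    return gflops, texel_rate, pixel_rate

print(config_numbers(360, 24, 12))  # -> (396.0, 13.2, 6.6)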
 
That 'Zelda HD' demo which was made on the very first WiiU dev kit, by a tiny team, in a short space of time using Twilight Princess assets will be absolutely blown into orbit by the full Nintendo EAD team, with a nice budget, on the final dev kits and a few years of development.

Can't wait to hear the crying after E3 2013 when it makes most of the PS4 / 720 launch games look very ordinary ;).
 
Holy crap, nice chart Thraktor. You've really turned it up a notch! About to dig into that presently.

I think it's worth noting that amidst all this madness, it is great to be able to come into this thread for rational discussion. I can't lie that I am slightly disappointed in the specs we know so far, but that does not in any way make Nintendo's hardware less intriguing.

Edit: Ok, I'm in for 360:24:12. A good 1.5 Xenos in all components and 396 Gflops total.
 

GaimeGuy

Volunteer Deputy Campaign Director, Obama for America '16
So with the WiiU we have a GPGPU with an out of order CPU to help complement the GPU.

Sort of like a hybrid where the GPU is the gas engine that really drives the car and the CPU is the electric battery there to help give the car its initial jolt and take part of the load off of the engine.

Getting the most out of the Wii U will involve properly utilizing the CPU to handle code branching while allowing the GPU to handle the raw throughput once the branch is determined.
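
Very loosely, the division of labour I'm picturing looks something like this. It's just a toy illustration of the concept, not anything Wii U specific, with numpy standing in for the GPU's wide, branch-free throughput:

Code:
# Toy sketch of the idea: branch-heavy decisions stay on the "CPU",
# then one wide, branch-free pass is handed to the "GPU"
# (numpy is only a stand-in for GPU-style throughput here).
import numpy as np

def step_frame(entities, wind_enabled):
    # "CPU" side: cheap branching decides which path the frame takes
    force = np.array([0.3, 0.0, 0.1]) if wind_enabled else np.zeros(3)

    # "GPU" side: the same arithmetic applied to every entity at once
    entities["vel"] = entities["vel"] + force
    entities["pos"] = entities["pos"] + entities["vel"] / 60.0
    return entities

entities = {"pos": np.zeros((10000, 3)), "vel": np.random.rand(10000, 3)}
entities = step_frame(entities, wind_enabled=True)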

My question:
What do we know about the cache on the Wii U? I seem to recall hearing that it would actually have a large L3 cache to supplement the L1 and L2 caches. A large cache along with a large, but slow RAM pool could prove useful, as you could cache some of the CPU instruction sets for fast access while keeping the high-volume data about your game environment (world parameters, graphics and the like) in RAM for the GPU. Should only be a weakness for open world games, I would imagine
 
Holy crap, nice chart Thraktor. You've really turned it up a notch! About to dig into that presently.

I think it's worth noting that amidst all this madness, it is great to be able to come into this thread for rational discussion. I can't lie that I am slightly disappointed in the specs we know so far, but that does not in any way make Nintendo's hardware less intriguing.

Edit: Ok, I'm in for 360:24:12. A good 1.5 Xenos on all fronts.

360 GFLOPs then ?.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Okay, so I realised that, even though the R700 architecture doesn't have a strict ratio between the number of SPUs and texture units, there are certain combinations which are possible, and certain ones which aren't. So, I put together a handy table on what the possible configurations are, and also what Gflops and texture fillrate values you'd get for each one:

wiiugputable.png


On the left are the number of SPUs and on the top are the number of texture units (both using "Wikipedia numbering"). The table then shows you whether a configuration is possible or not, with a green Y meaning it is a viable configuration, and a red N meaning (surprise surprise) it isn't. Then, when you've found a configuration that's viable, you can look to the column of numbers on the right to tell you how many Gflops you've got, and then you can look to the row of numbers on the bottom to tell you your texture fillrate. The number of ROPs in R700 is unconstrained by the other specifications, it just has to be a multiple of 4 and you're good (and you can multiply the number of ROPs by 0.55 to get your pixel fillrate in GP/s).
Nice chart, but why would you do that (the bolded part)? Do you have a particular ROP impediment in mind (say, fp32 blending) or something else which might hurt a ROP's performance?
 

z0m3le

Banned
Holy crap, nice chart Thraktor. You've really turned it up a notch! About to dig into that presently.

I think it's worth noting that amidst all this madness, it is great to be able to come into this thread for rational discussion. I can't lie that I am slightly disappointed in the specs we know so far, but that does not in any way make Nintendo's hardware less intriguing.

Edit: Ok, I'm in for 360:24:12. A good 1.5 Xenos on all fronts and 396 Gflops total.

That is basically 400GFLOPs, compared to Xenos' 240GFLOPs, and taking into account R700's efficiency increase... that is more like 2x Xenos.

360 GFLOPs then ?.
396GFLOPs according to the chart.
The ratio is shaders : texture units : ROPs.
 
Yup, I edited my post to clarify. I figure they've included a bit of extra oomph to make up for the lack of much vector support on the CPU side. This is even factoring in what resources the second screen sucks up (if it's used intensely that is, which we've seen so far doesn't always seem to be the case). Those numbers may have been something Nintendo targeted and then played around with clocks a bit. Just a ballpark though, given die size, performance, etc.
 

USC-fan

Banned
Well nobody asked me for input over here :p

Also, a bonus bonus: given what we now know, the worst case scenario for total bandwidth from both Wii U's pools of RAM is 81.55GB/s. If MS or Sony were to design a console with DDR3/4 but without an eDRAM equivalent (and that's a big if), then the best case scenario for total bandwidth would be 81.25GB/s.

(The DDR3/4 is assuming a 256 bit data bus at 1.3GHz (2.6GT/s))

Edit: Actually, Blu, do you have any idea as to the feasibility of what I was posting above (regarding access to DDR3 and eDRAM from the various internal components of the GPU)? Or is it hard to put into words just how wrong I am? ;)
Well, the PS4 is going to have a 512-bit wide bus on an interposer for the GPU. Going by AMD putting out this demo leak last year, I think it's pretty much confirmed by the other leaks that say the same thing about the PS4.

Intel Haswell will be using the same type of layout with DDR4.
 

Thraktor

Member
Nice chart, but why would you do that (the bolded part)? Do you have a particular ROP impediment in mind (say, fp32 blending) or something else which might hurt a ROP's performance?

You're assuming I put more thought into this than I did. I was just reading numbers off Wikipedia, to be honest, which seems to come to 1 pixel per ROP per Hz. Also these ROP numbers are "Wikipedia numbers", which I've learnt is usually some multiple of the actual underlying hardware units (4 in this case, I think).
 
Well, the PS4 is going to have a 512-bit wide bus on an interposer for the GPU. Going by AMD putting out this demo leak last year, I think it's pretty much confirmed by the other leaks that say the same thing about the PS4.

Intel Haswell will be using the same type of layout with DDR4.

And here we go...

As soon as it's proved more powerful than PS360, suddenly it's all about what PS4 / 720 might have lol.

I wish people would make official PS4 and 720 tech speculation threads so they stay out of WiiU threads tbh.
 
Yup, I edited my post to clarify. I figure they've included a bit of extra oomph to make up for the lack of much vector support on the CPU side. This is even factoring in what resources the second screen sucks up (if it's used intensely that is, which we've seen so far doesn't always seem to be the case). Those numbers may have been something Nintendo targeted and then played around with clocks a bit. Just a ballpark though, given die size, performance, etc.

You have been following this for a while Fourth, been nice reading your speculation even though i didn't understand most of it :p.

How close did they get to what you expected?

I myself never expected anything more than 600 GFLOPs for the GPU, a slower clocked Tri Core CPU (~2GHz) and 3GB of RAM, they got very close for me, im happy.

Proper First party games built from the ground up to take advantage of WiiU's hardware are going to look unreal !.
 
Gif-warz
I still don't find that impressive...

That 'Zelda HD' demo which was made on the very first WiiU dev kit, by a tiny team, in a short space of time using Twilight Princess assets will be absolutely blown into orbit by the full Nintendo EAD team, with a nice budget, on the final dev kits and a few years of development.

Can't wait to hear the crying after E3 2013 when it makes most of the PS4 / 720 launch games look very ordinary ;).

Please leave. Lol.

Anyway, is there an educated guess on what the GPU will be? After we get micrographs, is it possible to figure it out?

That chart should help us, right? I'm sure the texture units would be visible to count, so we can rule out what it couldn't be.
 

ozfunghi

Member
Anyway, is there an educated guess on what the GPU will be? After we get micrographs, is it possible to figure it out?

Well, i'm basically an idiot when it comes to these things, but the scans should tell us how many SPU's the GPU has, and in combination with the now known clockspeed, that would tell us how many flops it can push.


I don't know if this has been mentioned here, but Arkam claims the CPU clock went up 25% between Devkit 3 and 4. Which would mean it previously was about 1Ghz...
 

LuchaShaq

Banned
That 'Zelda HD' demo which was made on the very first WiiU dev kit, by a tiny team, in a short space of time using Twilight Princess assets will be absolutely blown into orbit by the full Nintendo EAD team, with a nice budget, on the final dev kits and a few years of development.

Can't wait to hear the crying after E3 2013 when it makes most of the PS4 / 720 launch games look very ordinary ;).

You probably meant to post in this thread. http://www.neogaf.com/forum/showthread.php?t=501543


Please don't ruin this thread with lolgifs and nonsense like this. Has been nice for me to read before this crap.
 
Well, i'm basically an idiot when it comes to these things, but the scans should tell us how many SPU's the GPU has, and in combination with the now known clockspeed, that would tell us how many flops it can push.


I don't know if this has been mentioned here, but Arkam claims the CPU clock went up 25% between Devkit 3 and 4. Which would mean it previously was about 1Ghz...

Was Arkam the guy that said the final version of the devkit had 'several hundred percent improved performance'?

If so, those PS360 ports people are playing were prob created using sub-1GHz CPUs and maybe weaker GPUs.
 
Well, i'm basically an idiot when it comes to these things, but the scans should tell us how many SPU's the GPU has, and in combination with the now known clockspeed, that would tell us how many flops it can push.


I don't know if this has been mentioned here, but Arkam claims the CPU clock went up 25% between Devkit 3 and 4. Which would mean it previously was about 1Ghz...

I said damn.

Also, I thought about counting the SPU's, but wouldn't there be too many to count? I guess some people will take the time... lol
 

ozfunghi

Member
Was Arkam the guy that said the final version of the devkit had 'several hundred percent improved performance'?

If so, those PS360 ports people are playing were prob created using sub-1GHz CPUs and maybe weaker GPUs.

Arkam was the guy that came into WUST 3 or 4, can't remember which, and basically slammed all the speculation into the ground in what seemed to be (to me at least) a trollish manner. Turns out he wasn't full of shit, but he got a lot of hate because of the way he went about it. At the time, expectations were higher than his claims. He did say that he wasn't a programmer himself. He is also the guy that leaked the spec sheet in the "rumor: WiiU final specs" topic. He's been verified as working for a developer.

He also said in the "rumor" topic that his team had greatly improved performance by tweaking their "360" code since he made his first claims in the WUST. (Thus admitting a little bit that the hardware wasn't as crappy as he made it out to be at first.)

The guys claiming there were "hundreds" of % performance gain were Wsippel, who found info that the CPU performance got a huge boost with certain middleware, and Ideaman, who said that certain games saw their framerate double, in addition to gaining extra effects, between spring and fall 2012.

Also, I thought about counting the SPU's, but wouldn't there be too many to count? I guess some people will take the time... lol

I have no idea, i'm only parroting what i saw more tech-savvy people claim.
 

MDX

Member
I'd agree with this request. This thread is specifically for in-depth technical discussion, and not really the place for discussing how nice the Zelda demo does or doesn't look. Off-hand comments are fine, but big GIFs like those are very attention-grabbing, so inevitably draw the discussion towards them (particularly when quoted again).

Nice chart.

But I'm wondering, how does Wii's TEV translate to Texture Units? Or Wii's TEV pipelines to SPUs?

I know that Wii basically doubled the pipelines and stages over the GC. Just trying to see where Nintendo might be going with its numbers.
 

MadOdorMachine

No additional functions
This thread is about (serious) technical discussion. So of course we're discussing the specs here.

Actually, if you could edit those GIFs out of your post that would probably be helpful to keep this thread focused on what it has been and not devolving into screenshot warzzzz.
Pretty please?
HjWvo.gif

Raw specs aren't an accurate measure of real-world performance. Clock speed and throughput of RAM are irrelevant if you're not taking into account the number of pipes delivering them. The memory pool in the PS3 is a known bottleneck for that system, yet people are quick to criticize the Wii U's memory for being lower. It could be that it's more optimized than the PS3's. That's the point of my post, so I'll remove the GIFs, but you have to look at the end result.
 

USC-fan

Banned
And here we go...

As soon as it's proved more powerful than PS360, suddenly it's all about what PS4 / 720 might have lol.

I wish people would make official PS4 and 720 tech speculation threads so they stay out of WiiU threads tbh.

This thread is for technical discussion, thanks. Don't need this nonsense.
 

M3d10n

Member
Nice chart.

But I'm wondering, how does Wii's TEV translate to Texture Units? Or Wii's TEV pipelines to SPUs?

I know that Wii basically doubled the pipelines and stages over the GC. Just trying to see where Nintendo might be going with its numbers.

The TEV units weren't exactly texture units; they were very similar to NVidia's Register Combiners introduced in the TNT and pre-shader GeForce GPUs. They're the direct ancestors of pixel shaders, allowing games to "program" how multiple texture layers and other inputs were combined to produce the final pixel color.
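
For a rough flavour of what one of those fixed-function stages does, here's a simplified sketch of the TEV-style blend formula as I understand it (the real hardware has more inputs, biases and scales, so this is purely illustrative):

Code:
# Simplified sketch of a TEV / register-combiner style stage:
# each stage blends four inputs (texture samples, colours, constants,
# or the previous stage's output) with a fixed lerp-like formula.
def combine_stage(a, b, c, d):
    # roughly: output = d + (1 - c) * a + c * b, per colour channel
    return tuple(dv + (1.0 - cv) * av + cv * bv
                 for av, bv, cv, dv in zip(a, b, c, d))

base_tex  = (0.8, 0.6, 0.4)  # diffuse texture sample
light_map = (0.5, 0.5, 0.5)  # lightmap sample
zero      = (0.0, 0.0, 0.0)

# modulate base by lightmap: a=0, b=texture, c=lightmap, d=0 -> c * b
print(combine_stage(zero, base_tex, light_map, zero))  # (0.4, 0.3, 0.2)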
 
Arkam was the guy that came into WUST 3 or 4, can't remember which, and basically slammed all the speculation into the ground in what seemed to be (to me at least) a trollish manner. Turns out he wasn't full of shit, but he got a lot of hate because of the way he went about it. At the time, expectations were higher than his claims. He did say that he wasn't a programmer himself. He is also the guy that leaked the spec sheet in the "rumor: WiiU final specs" topic. He's been verified as working for a developer.

He also said in the "rumor" topic that his team had greatly improved performance by tweaking their "360" code since he made his first claims in the WUST. (Thus admitting a little bit that the hardware wasn't as crappy as he made it out to be at first.)

The guys claiming there were "hundreds" of % performance gain were Wsippel, who found info that the CPU performance got a huge boost with certain middleware, and Ideaman, who said that certain games saw their framerate double, in addition to gaining extra effects, between spring and fall 2012.

I have no idea, i'm only parroting what i saw more tech-savvy people claim.

There was a guy that came into EC's WiiU tech spec thread about three weeks ago and said there was 'several hundred percent increased performance' from the second-to-last dev kits, and that there was another round of updates in the final dev kit that were 'worth mentioning' in relation to performance in the latest 'patch notes'.

Can't for the life of me remember his name, i asked if he was a trusted 'source' or 'insider' and was told that he was reliable as he only posted trustworthy updates.

If that is indeed true then i think the launch games / ports were created on much weaker than final devkits, so we haven't seen what the console is truly capable of yet.
 
This thread is for technical discussion, thanks. Don't need this nonsense.

Ok Mr 'WiiU will under no circumstances use a GPGPU' for three months straight right before Iwata confirms it uses a GPGPU haha. In short don't listen to a word this imbecile spews about hardware, he just parrots things he has read from Beyond3D like he is some expert.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Ok Mr 'WiiU will under no circumstances use a GPGPU' for three months straight right before Iwata confirms it uses a GPGPU haha. In short don't listen to a word this imbecile spews about hardware, he just parrots things he has read from Beyond3D like he is some expert.
Guys, are you purposefully trying to drag down this thread to the level a couple of other originally-tech threads have already reached, or is that your tempers you can't control? Seriously. It does not matter what some gaffer said or did not say in another thread, or on another forum, or in a pm. We all have our biases, things that please us and things that irk us. If you partake in a thread that has certain plain requirements in its topic - try to comply. Please?
 

ozfunghi

Member
There was a guy that came into EC's WiiU tech spec thread about three weeks ago and said there was 'several hundred percent increased performance' from the second-to-last dev kits, and that there was another round of updates in the final dev kit that were 'worth mentioning' in relation to performance in the latest 'patch notes'.

That would be Wsippel. He's not a dev but he is a programmer i believe and he does his homework. Correct me if i'm wrong Wsippel.
 

StevieP

Banned
That would be Wsippel. He's not a dev but he is a programmer i believe and he does his homework. Correct me if i'm wrong Wsippel.

http://www.neogaf.com/forum/showpost.php?p=44133677&postcount=8584
wsippel said:
It gets more and more abstract, but several months ago, I posted that a certain middleware dev reached 360-levels of performance on Wii U. Another update brought further optimizations. Then, the next big update increased the performance by several hundred(!) percent. And a few days ago, there was another round of Wii U specific optimizations significant enough to be mentioned in the changelog. "Enhanced Broadway"? Really?

Was this maybe referring to whichever pieces of the hardware were "locked out" until recently according to known source Matt?
 

IdeaMan

My source is my ass!
There was a guy that came into EC's WiiU tech spec thread about three weeks ago and said there was 'several hundred percent increased performance' from the second-to-last dev kits, and that there was another round of updates in the final dev kit that were 'worth mentioning' in relation to performance in the latest 'patch notes'.

Can't for the life of me remember his name, i asked if he was a trusted 'source' or 'insider' and was told that he was reliable as he only posted trustworthy updates.

If that is indeed true then i think the launch games / ports were created on much weaker than final devkits, so we haven't seen what the console is truly capable of yet.

Well, i've been saying that for months :p

And it's even more the case for the CPU. As i hinted a moment ago, a big studio managed to self-hinder how much throughput they could retrieve from the CPU, i would say by between 20 and 40% to be more precise. It's huge, yes. It's related to the way their engine put the U-CPU to use. It has been corrected since then, but very late in the development cycle. Now, it doesn't mean every studio encountered this issue, but at the very least it means the architecture is "unique" enough to require some adaptation/learning. This studio developed on Wii too, so they already knew the Broadway CPU, therefore i doubt they would have met this difficulty with the U-CPU if it was just a three-core Broadway (although it's a possibility, they may just have messed up, that's all).

When you combine all that has been revealed on the learning curve + improvements (either hardware with dev kit revisions, or software with SDKs, etc.) these past months + the fact that the system can run late-gen ports + plenty of other info (the fab process, the eDRAM, the fact that marcan analyzed this CPU through hacking, "softwarely", and not directly from the circuitry, so it may only mean that it's compatible with Broadway, etc. etc.) + reassuring comments from several sources (albeit countered by other negative ones, but that means there's a correct way to use it, in balance with the other components) = there's more to it than just "lol 1.2GHz 3-core Broadway" + above all, the second-gen titles on Wii U will start from saner grounds than the first ones, which were constrained by several unoptimized factors (however, you could say that for most systems).

Now, i'm not defending Nintendo's hardware choices, i would have gladly exchanged 30 euros + a bigger casing for a CPU that directly delivers more grunt without having to deploy huge optimization efforts here and there (by using the GPGPU aspect of the GPU if it's really a GPU-centric system, the DSP, the ARM chip, etc.). It would have warranted better launch window ports, the superior versions (at least a tad) that i expected.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
Well, i've been saying that for months :p

And it's even more the case for the CPU. As i hinted a moment ago, a big studio managed to self-hinder how much throughput they could retrieve from the CPU, i would say by between 20 and 40% to be more precise. It's huge, yes. It's related to the way their engine put the U-CPU to use. It has been corrected since then, but very late in the development cycle. Now, it doesn't mean every studio encountered this issue, but at the very least it means the architecture is "unique" enough to require some adaptation/learning. This studio developed on Wii too, so they already knew the Broadway CPU, therefore i doubt they would have met this difficulty with the U-CPU if it was just a three-core Broadway (although it's a possibility, they may just have messed up, that's all).

When you combine all that has been revealed on the learning curve + improvements (either hardware with dev kit revisions, or software with SDKs, etc.) these past months + the fact that the system can run late-gen ports + plenty of other info (the fab process, the eDRAM, the fact that marcan analyzed this CPU through hacking, "softwarely", and not directly from the circuitry, so it may only mean that it's compatible with Broadway, etc. etc.) + reassuring comments from several sources (albeit countered by other negative ones, but that means there's a correct way to use it, in balance with the other components) = there's more to it than just "lol 1.2GHz 3-core Broadway" + above all, the second-gen titles on Wii U will start from saner grounds than the first ones, which were constrained by several unoptimized factors (however, you could say that for most systems).

Now, i'm not defending Nintendo's hardware choices, i would have gladly exchanged 30 euros + a bigger casing for a CPU that directly delivers more grunt without having to deploy huge optimization efforts here and there (by using the GPGPU aspect of the GPU if it's really a GPU-centric system, the DSP, the ARM chip, etc.). It would have warranted better launch window ports, the superior versions (at least a tad) that i expected.
What's your NNID, IdeaMan? I've been trying to look you up all week but Miiverse finds nothing. :(
 

Thraktor

Member
could the GPGPU make up for the lack of new SIMDs?

Lack of SIMD units in the CPU? Yes, sort of, or at least that seems to be Nintendo's intent. It is the case nowadays (as I said in the other thread) that when it comes to streaming SIMD-heavy code, GPUs handily beat CPUs in the Gflops/watt and Gflops/mm^2 metrics, which are metrics you want to optimise when designing a games console with a strict budget and a strict power/heat envelope to stay in. That said, I think that it's still worth having some SIMD functionality on the CPU to handle code that wouldn't run well on the GPU but would still benefit from proper SIMD support. For that reason I assumed that Nintendo would add a SIMD unit (or an A2-style FPU/SIMD hybrid) to just one of the cores, to handle those kinds of tasks, although it seems that this isn't the way they've gone. We'll have to wait and see whether that was a good idea or not. That said, Blu posted a test of Broadway's capabilities running SIMD-heavy code a page or two ago, and it actually didn't perform as badly as you would have expected (and that's without even using paired singles), so it might not be as big a deal as I'd been assuming.

Also, I'm actually starting to think that the crazy 550GB/s eDRAM bandwidth might not be quite as crazy as I'd originally thought. I was reading through a description of XBox360's memory subsystem, and it occurred to me that the 256GB/s bandwidth from ROPs to eDRAM wasn't actually overkill in the way I was assuming. I thought that it was simply a matter of a 4096 bit interconnect being the only possible configuration of 10MB eDRAM at the time, and hence MS just went with it and only used a few tens of GB/s of it. In fact, it's the other way round. The 256GB/s is the maximum theoretical throughput they calculated the ROPs would use up, so any less than that could become a bottleneck (and in this particular case, a 4096 bit interconnect isn't really all that expensive).

That brings me to the Wii U. Let's just take as an assumption that my block diagram on memory access was correct, and that we're looking at a 420:24:12 configuration of the GPU (that's SPUs:Texture units:ROPs, in Wikipedia numbering). So, there would be three ROP "bundles", six texture unit "bundles", and each texture unit bundle would be aligned with an array of 70 SPUs. So, if the XBox360's 2 ROP units (8 ROPs by Wikipedia numbering) require a 2048 bit interconnect each without bottlenecking, then surely Wii U's 3 ROP units would require the same? That comes to 6144 bits of interconnect being taken up just by the ROPs to keep per-ROP parity with XBox360. Then we've got the LSUs attached to the SIMD arrays, of which we've got six. Let's say you want to give each of them a 256-bit connection, which isn't so crazy when you think each one's feeding 77 GFlops of processing power, and you're up to a total of 7680 wires coming from the eDRAM. Add a 512 bit connection to the CPU (which is feasible with the GPU and CPU on an MCM) and you're up to an 8192 bit wide total interconnect and 550GB/s of theoretical bandwidth. Of course the total 550GB/s would never be reached in the real world (not even close), but it might need to be that high to give each component the bottleneck-free connection it requires.

You'd then have a distribution of bandwidth that looks like this:

ROPs: 2048bit / 137.5GB/s x 3 = 6144bit / 412.5GB/s
SPUs: 256bit / 17.2GB/s x 6 = 1536bit / 103.1GB/s
CPU: 512bit / 34.4GB/s
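
For anyone who wants to sanity-check those figures, they're simply the 550GB/s total divided up in proportion to each client's share of the assumed 8192-bit interconnect. Here's a quick sketch of that split, resting entirely on my bus-width guesses above:

Code:
# Split an assumed 550GB/s eDRAM total across assumed bus widths,
# in proportion to each client's share of the 8192-bit interconnect.
TOTAL_GBPS = 550.0
BUS_WIDTH_BITS = {            # all widths are guesses from the post above
    "ROPs (3 x 2048-bit)": 3 * 2048,
    "SPU LSUs (6 x 256-bit)": 6 * 256,
    "CPU (1 x 512-bit)": 512,
}
total_bits = sum(BUS_WIDTH_BITS.values())  # 8192

for client, bits in BUS_WIDTH_BITS.items():
    print(f"{client}: {TOTAL_GBPS * bits / total_bits:.1f} GB/s")
# -> 412.5, 103.1 and 34.4 GB/s respectively, matching the figures above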

Of course, I wouldn't necessarily bet on the 8192 bit scenario being true (for example you could divide all those numbers by two to get a 4096 bit scenario which might work, for all I know), but after reading up on the XBox360 a bit, I can no longer rule it out completely.

Anyway, after a year and a half of speculation and discussion with you fine folks, I finally have a Wii U sitting in a box next to me, and I'm about to head home to plug it in and start playing. So you'll have to excuse me if I start posting a whole lot less for the next few days :)
 

DynamicG

Member
Question for those in the know: what does the information on the WiiU CPU mean for non-game performance? Could that be a reason for the slow OS, or is it more a software design issue?
 

Earendil

Member
Question for those in the know: what does the information on the WiiU CPU mean for non-game performance? Could that be a reason for the slow OS, or is it more a software design issue?

Odds are it's a software issue. There's no reason this CPU couldn't handle whatever OS you throw at it, as long as it's properly designed and coded.
 
This thread is an oasis in a desert of murky fanboyism. Thanks to all who are getting down to the nitty-gritty.

I agree. I usually avoid posting in these threads but I always read them. I've really enjoyed this one, especially with the focus remaining on the Specs (as much as possible). Just had to post a big THANKS! to those putting their time and effort into finding out as much as possible about the Wii U hardware!

I'll go back to lurking in these threads now :)
 
Odds are it's a software issue. There's no reason this CPU couldn't handle whatever OS you throw at it, as long as it's properly designed and coded.
Is there anything they can do? Cause it's sloooow. Too slow. It's a lot of waiting and that makes it less fun to use.
It gets annoying from the first moments you use it.
 