
WiiU technical discussion (serious discussions welcome)

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
As much as I hate doing this...

You mean like how I talked about the Wii U's MEM2 pool being on a 64-bit bus? DDR3-1600 on a 64-bit bus: 4 chips, 512MB capacity each, all on a 16-bit interface. 16 bits x 4 = 64 bits. 200MHz base clock x 4 x 16 = 10.2GB/s of bandwidth. This is in comparison to the next-gen consoles, which appear to be using 256-bit buses for their main memory pools. The Xbox 360 and PS3 also used a 128-bit bus for their GDDR3, which still provides more raw bandwidth than the Wii U's. Even with a modern memory controller there's no way the Wii U's RAM is on par, even in the real world, vs the Xbox 360's memory.

Or the likelihood that the GPU only has 8 ROPs due to the low memory bandwidth of the Wii U. ROPs are bandwidth dependent; there's no point adding more ROPs unless you can feed them data fast enough.

To which I expanded on by using the Xbox 360 as an example. With the Xbox 360 the ROPs were integrated into the eDRAM die. Due to this configuration the ROPs could be fed data at around 256 gigabytes per second. The Wii U's eDRAM implementation does not seem to be similar to this, with its bus being considerably slower.

Or the fact the CPU is the size of a single Intel Atom core and has an incredibly low TDP. It's also based on the decade-old IBM PPC 750 architecture. Its performance is going to be anything but stellar.



There's no smoke without fire. In the case of the Wii U's CPU, the house is well and truly alight. We've seen DICE slam it, Crytek slam it, unnamed sources months ago slam it, and even developers publicly comment on how it was an obstacle they had to work around.

There's also no denying the CPU is based on the decade-plus-old IBM PPC 750 architecture, and has the transistor count of a single Intel Atom core. It also has an incredibly low TDP.




Technical knowledge from a small-time indie developer who has made one simple small game for the eShop. They also just so happen to only make games for Nintendo hardware.

Yeah, totally indicative of the Wii U's performance, and an unbiased source.

Also, you're just as full of rhetoric as anyone else here. You come into this thread criticising people for their arguments and views, yet offer none of your own.
Pal, no offense, but your posts (like the one above) have contributed nothing to this thread. On the contrary, they've brought the level of discussion down several notches.

You couldn't be bothered to correctly compute the BW of a well-known RAM & bus configuration, even though you got the basic multipliers right (hint: it's not 10.2GB/s) - how you managed that is beyond me.

You come up with the most absurd of ideas that U-GPU's ROPs would be bound to the slower DDR3 pool, and not to the eDRAM pool.

You end your posts with 'Fuck you, console vendor X'.

Have you considered the possibility you might be in the wrong thread, as per your current state of mind?
 

Kenka

Member
If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and Wii U, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open about his job in the past and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before purchasing the Wii U. It's tiring to wait for announcements from Nintendo's side.
 

nordique

Member
The connection between the 360 eDRAM parent and daughter die is 32GB/s; the connection between parent and main memory is 22GB/s - which you have to go through if you are going to do something useful with what you wrote to eDRAM.

The GPU cannot read from eDRAM; the read bandwidth there is a whopping zero.

Calling that "fast" and "high bandwidth" while calling the Wii U implementation you suggest slow and low-bandwidth is both an apples-to-oranges comparison and a purely semantic argument.

I'm pretty sure nobody has used RAM "slowness" to mean "throughput to ROPs" (you mean "from" the ROPs?) until your elaborate backtrack just now.

Before this elaborate clarification you said:



Now you seem to be saying that you meant write-only bandwidth from ROPs instead, which I'm pretty sure cannot really be called a "bus" at all, certainly not more so than two other relevant actual busses.


Just to potentially add to this, and in general, regarding the "on-paper" 360 bandwidth that is brought up against the Wii U: it's important to consider that the 360's bandwidth seems to be limited from theoretical performance to 10.8GB/s (aggregated upstream/downstream bandwidth, or so the tech-speak results from Google searches tell me) in real-world situations. I don't know if any 360 developer here can confirm that, but I also recall reading in the past, during the heyday of HD console discussion, that the real-world performance of the two HD "twins" wasn't exactly what the spreadsheets said.

That is, many of the on-paper specs of the 360 and PS3 were really bloated compared to real-world performance.
 

nordique

Member
If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and Wii U, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open about his job in the past and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before purchasing the Wii U. It's tiring to wait for announcements from Nintendo's side.

It should be possible to port a game across all three, and all "next-gen engines" should be fairly scalable. Whether a game goes to Wii U is less dependent on the tech side and more on the publisher's dime - I mean, Dead Rising was ported to the Wii because Capcom decided so.

If you're concerned about which games are coming out before committing to a Wii U purchase, I would suggest you wait until the other two are revealed and decide for yourself.
 

Kenka

Member
It should be possible to port a game across all three, and all "next-gen engines" should be fairly scalable. Whether a game goes to Wii U is less dependent on the tech side and more on the publisher's dime - I mean, Dead Rising was ported to the Wii because Capcom decided so.

If you're concerned about which games are coming out before committing to a Wii U purchase, I would suggest you wait until the other two are revealed and decide for yourself.
Yes, this seems wiser. It's a bit of a bummer that the entire future library could be built solely on good relationships between Nintendo and third parties, since Nintendo doesn't care much anyway.
 

Durante

Member
If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and Wii U, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open about his job in the past and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before purchasing the Wii U. It's tiring to wait for announcements from Nintendo's side.
You want a simple answer to a complex issue.

Of course it's possible to create a game targeting all 3 platforms. However, it's also possible to create a game targeting everything from 3DS to PS4.

Of course it's also possible to create a game that targets PS4 and 720, and is extremely hard to port down to Wii U.

It's not a yes/no question, it's very much a sliding scale. The only thing anyone can do to provide a general answer is estimate the relative difficulty of multi-platform development.

And anyway, if your question is really "will these games come to Wii U", it may actually be at least as important to look at sales threads as it is to look at this thread.
 

ozfunghi

Member
If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and Wii U, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open about his job in the past and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before purchasing the Wii U. It's tiring to wait for announcements from Nintendo's side.

That answer is open to interpretation, I think. Without wanting to come across as presumptuous, because I'm not the guy with the info, I think it will almost always depend on the game, the developer and the publisher, rather than the actual hardware being a clear-cut case of these ports being possible or not.

So the hardware is weaker; obviously games that are going to be pushing the other systems, or that rely heavily on certain strengths of that hardware, are not easily ported down. Should Wii have gotten a Dead Rising port? Should Xbox have gotten a Half-Life 2 port? Should PS360 have gotten Far Cry 3? Obviously, anything "can" be ported; the question is whether it should be ported under any and all conditions. There will be games that "should" not be ported to Wii U, for sure. But I think many games "could" be ported as well, meaning there should be no real obstacle to doing so other than somewhat downsized graphics. But is the developer up for it? I have a feeling that if you asked the guys over at Arkam's or Lherre's studio, they'd be rather sceptical and/or dismissive, while if you asked the guys from Shin'en, I bet they'd say it is possible in many cases, given the proper resources. And yet, if publishers now aren't even considering Wii U for CURRENT-generation games regardless of hardware constraints, well, how much hope should we have for the future?
 

tipoo

Banned
Well, we have the full hardware diagram for XBox 3, which describes some pretty small details, yet there's no mention of a DSP. Also, sound may be described as trivial for modern PC CPUs such as the Phenom II and Intel Core i series, but Jaguar isn't in that league. It wouldn't surprise me if audio could take up as much as a whole Jaguar core, or close enough.

Xbox 3's Jaguar CPU is clocked 29% faster, but it's got a much longer pipeline, 14 stages compared to Espresso's 4. I'm not saying that gives the Wii U's CPU the advantage core for core or even makes it 100% as fast (we really can't compare exactly), but the shorter pipeline should help to counteract the lower clock speed.

One whole Jaguar core for sound doesn't strike me as right. Jaguar isn't as powerful as Bulldozer or Ivy Bridge, but if audio takes a few percentage points of one of those cores, I can't imagine it taking 100% of a Jaguar core. It's slower, but not by that much.

And if it did, why wouldn't they include a DSP which is so tiny and cheap to produce?

EDIT: According to this, the nextbox does have audio acceleration

http://www.eurogamer.net/articles/df-hardware-next-gen-xbox-specs-leak
 
Wii U DDR3 bandwidth is 12.8 GB/s (assuming a 1.6 GHz data rate × a 64-bit interface ÷ 8 bits per byte).
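To make the arithmetic explicit, here's a minimal Python sketch of that peak-bandwidth formula; the eDRAM figures use the 1024-bit-bus hypothesis discussed in this thread, not a confirmed spec:

```python
# Peak bandwidth = effective data rate (MT/s) x bus width (bits) / 8 bits per byte.
# All inputs below are this thread's assumptions, not confirmed Wii U specs.

def peak_bandwidth_gbs(data_rate_mts, bus_width_bits):
    """Theoretical peak bandwidth in GB/s."""
    return data_rate_mts * bus_width_bits / 8 / 1000

# Wii U MEM2: four DDR3-1600 chips with a 16-bit interface each -> 64-bit bus
print(peak_bandwidth_gbs(1600, 16 * 4))   # 12.8

# Hypothesized eDRAM on a 1024-bit bus, one transfer per GPU clock
print(peak_bandwidth_gbs(450, 1024))      # 57.6 (early dev kit clock)
print(peak_bandwidth_gbs(550, 1024))      # 70.4 (retail clock)
```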

On the topic of VLIW (Wii U) vs non-VLIW SIMD (Durango/Orbis) architecture differences:

Anandtech said:
AMD Graphics Core Next: Out With VLIW, In With SIMD

The fundamental issue moving forward is that VLIW designs are great for graphics; they are not so great for computing. [...]

The principal issue is that VLIW is hard to schedule ahead of time and there’s no dynamic scheduling during execution, and as a result the bulk of its weaknesses follow from that. As VLIW5 was a good fit for graphics, it was rather easy to efficiently compile and schedule shaders under those circumstances. With compute this isn’t always the case; there’s simply a wider range of things going on and it’s difficult to figure out what instructions will play nicely with each other. Only a handful of tasks such as brute force hashing thrive under this architecture.

Furthermore, VLIW lives and dies by the compiler, which means not only must the compiler be good, but every compiler must be good. This is an issue when it comes to expanding language support, as even with abstraction through intermediate languages you can still run into issues, including a compiler producing intermediate code that the shader compiler can't handle well.
In short, with VLIW, scheduling (deciding which instructions are executed when) needs to be done by the compiler, while non-VLIW allows dynamic scheduling at runtime, so it can be used more efficiently, especially for compute applications.
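As a toy illustration of that difference (nothing like AMD's real shader compiler, just the packing idea): a VLIW machine can only issue what the compiler managed to bundle together ahead of time, so a serial dependency chain, common in compute code, leaves most of each bundle empty:

```python
def pack_vliw(ops, width=5):
    """Greedy static scheduler: fill bundles of `width` slots with ops whose
    inputs were all produced by earlier bundles (toy model of VLIW5)."""
    bundles, ready = [], set()
    remaining = list(ops)
    while remaining:
        bundle, produced = [], set()
        for op in list(remaining):
            name, inputs, output = op
            if inputs <= ready and len(bundle) < width:
                bundle.append(name)
                produced.add(output)
                remaining.remove(op)
        if not bundle:
            break                       # unsatisfiable dependencies; bail out
        bundles.append(bundle)          # unfilled slots are simply wasted
        ready |= produced               # results visible only to later bundles
    return bundles

# A serial dependency chain: each op consumes the previous op's result.
chain = [("load", set(), "r0")] + \
        [(f"op{i}", {f"r{i}"}, f"r{i+1}") for i in range(4)]
print(pack_vliw(chain))
# [['load'], ['op0'], ['op1'], ['op2'], ['op3']] -> 4 of 5 slots idle per cycle
```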
 
Thought I could contribute something to this thread that isn't about eDRAM or GPGPUs :p

One specific area where the WiiU can potentially (game dependent of course) gain a bit of performance compared to the 360 and PS3 is skinning/animation.

Some rough background info for those not familiar with it. Animation is in 90% of cases done using joints/bones. These consist of transformation (i.e. rotation) information set within a hierarchical structure called a skeleton. Being hierarchical means that if you move the parent, all the children move along with it. Rotate the elbow, and the arm, hand and fingers all go together.
Animators manipulate this skeleton into various animation frames that are interpolated and played back at runtime.

Skinning is the process where you take the joint data and the 3D model, and transform each vertex (a 3D point; vertices are linked together to define polygons) in the model by the joints that affect it. Vertices in the hand are affected by the hand joints, etc.

This is a relatively expensive process, and it gets more expensive the more joints influence a vertex. Many games limit the number of influences to 4.
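For anyone who wants that spelled out, here's a minimal CPU-side sketch of the usual linear blend skinning math in Python/numpy (a toy; as described below, engines run this in a shader or on SPUs):

```python
# Minimal linear-blend-skinning sketch. Each skinned vertex is a weighted
# sum of the bind-pose vertex transformed by each influencing joint.
import numpy as np

def skin_vertex(v, joints, influences):
    """v: (3,) bind-pose position; joints: list of 4x4 skinning matrices
    (joint world transform x inverse bind pose); influences: [(joint_id,
    weight)], typically at most 4 entries, with weights summing to 1."""
    v_h = np.append(v, 1.0)                       # homogeneous coordinate
    out = np.zeros(4)
    for joint_id, weight in influences:
        out += weight * (joints[joint_id] @ v_h)  # transform, then blend
    return out[:3]

# one joint = identity, one = translate +1 on x; vertex blended 50/50
identity = np.eye(4)
shift_x = np.eye(4); shift_x[0, 3] = 1.0
print(skin_vertex(np.zeros(3), [identity, shift_x], [(0, 0.5), (1, 0.5)]))
# -> [0.5 0.  0. ]
```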

Now, on to the platform specific part :)

A generic way to do the skinning is to use the vertex shader. A joint palette is sent to the shader, together with the influence weights and the rest of the normal shader data. The GPU then crunches everything together to produce the final vertex position.
The problem with this method comes with the fact that you usually need to render the same model more than once. For example, for rendering into the shadow map, rendering into a GBuffer, etc. That means the GPU has to do the skinning more than once, which is a bit wasteful.

On the PS3, for example, you have a different option. You can use the SPUs to do the skinning very efficiently, store the resulting data, and pass that to the GPU. While this is great for the GPU, especially if you are vertex bound, it requires extra RAM to store the data. Depending on the game, you might not be able to spare that memory.

On the WiiU, you can use the GPU's ability to stream out data to export the skinning data in the same way as the SPUs on the PS3, and reuse it later. The advantage on the WiiU is that you have a lot more memory, so storing this data should be easy enough for most cross platform titles.
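A rough sketch of that trade-off in Python (stand-in functions only; on Wii U the cached buffer would come from GPU stream-out, on PS3 from the SPUs, and all names here are purely illustrative):

```python
# "Skin once, reuse for every pass" vs re-skinning per pass.

def skin_mesh(mesh, joint_palette):
    # stand-in for the expensive skinning step (vertex shader / SPU job)
    return [("skinned", v, joint_palette) for v in mesh]

def render_pass(pass_name, vertices):
    print(pass_name, len(vertices), "pre-skinned vertices")

mesh, palette = list(range(10000)), "frame_palette"

# naive: skin inside every pass (shadow map, GBuffer, ...) -> 2x the work
for p in ["shadow", "gbuffer"]:
    render_pass(p, skin_mesh(mesh, palette))

# stream-out style: skin once, keep the result in RAM, reuse per pass
skinned = skin_mesh(mesh, palette)   # costs extra memory to store
for p in ["shadow", "gbuffer"]:
    render_pass(p, skinned)
```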

Sorry for the mega post, hope this is interesting to someone.
 

Donnie

Member
One whole Jaguar core for sound doesn't strike me as right. Jaguar isn't as powerful as Bulldozer or Ivy Bridge, but if audio takes a few percentage points of one of those cores, I can't imagine it taking 100% of a Jaguar core. It's slower, but not by that much.

And if it did, why wouldn't they include a DSP which is so tiny and cheap to produce?

EDIT: According to this, the nextbox does have audio acceleration

http://www.eurogamer.net/articles/df-hardware-next-gen-xbox-specs-leak

I don't think these cores really compare to Bulldozer or Ivy Bridge, to be honest. One Bulldozer core is about the size of 5 Jaguar cores (well, the same size as 7, but there's a process difference to take into account, so 5 is more accurate). As far as audio processing goes, it can be surprising how hard it can hit a CPU. I remember it being argued that there was no way audio would use even a significant portion of a 3.2GHz Xenon core, in reference to talk of the Wii U's DSP, until an MS engineer confirmed it often takes one entire core to process audio. Now, I'm not saying audio processing would take an entire Jaguar core, just that it wouldn't surprise me if it could take close to that; these aren't powerful cores after all.

You're right, that article does mention audio codecs, though I was under the impression that was for Kinect? I suppose we'll have to wait and see to what extent it's used within the system.

If they haven't included a DSP for games then I'd guess the reason would be because they have 8 CPU cores there. They could have included a DSP for 360 but they had a triple core CPU so decided to let that do everything.
 

Donnie

Member
I'm gonna throw a number out there: 70.4 GB/s. This, I believe, is the bandwidth from the Wii U GPU to its eDRAM.

Earlier in the thread, UX8GD eDRAM from NEC/Renesas was brought up as a leading (almost surefire) candidate for the Wii U GPU's on-chip memory. It comes in several configurations, but one now strikes me as most likely: 4 x 8MB macros, each with a 256-bit bus. What makes this interesting is that very early on there was talk of the first Wii U dev kits containing an underclocked RV770LE. This news got our hopes up, but 640 shaders and the like are now out of the question and indeed never seemed to be in the cards. Why wouldn't they have just used a weaker, smaller and cooler PC part then (I'm assuming the report was true)? Well, the bandwidth of that card to its onboard GDDR3 just happens to be 57.6 GB/s. What's the bandwidth of eDRAM on a 1024-bit bus at 450 MHz (the reported clock of Wii U dev kits up until late 2011)? 57.6 GB/s. I know it's been hypothesized before, but it seems increasingly likely those first dev kits utilized that particular Radeon to simulate the bandwidth to the Wii U's MEM1. Since they upped the speed to 550 MHz, it should now clock in at 70.4 GB/s, for better or worse.

Wasn't the Wii U's GPU originally 400MHz? Sorry to ruin the hypothesis a bit, but I do remember it being mentioned that the system originally had a 400MHz GPU and a 1GHz CPU.
 

ozfunghi

Member
Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol
 

tipoo

Banned
Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol

You mean this?

http://www.youtube.com/watch?feature=player_embedded&v=K40orKpbJBU

It does look pretty good, but still nothing that would be mind-blowing on the HD 7th-gen consoles, imo. The textures and geometry are obviously far more than what the Wii could do, but that's about all I could say.
 

tipoo

Banned
I don't think these cores really compare to Bulldozer or Ivy Bridge, to be honest. One Bulldozer core is about the size of 5 Jaguar cores (well, the same size as 7, but there's a process difference to take into account, so 5 is more accurate). As far as audio processing goes, it can be surprising how hard it can hit a CPU. I remember it being argued that there was no way audio would use even a significant portion of a 3.2GHz Xenon core, in reference to talk of the Wii U's DSP, until an MS engineer confirmed it often takes one entire core to process audio. Now, I'm not saying audio processing would take an entire Jaguar core, just that it wouldn't surprise me if it could take close to that; these aren't powerful cores after all.

You're right, that article does mention audio codecs, though I was under the impression that was for Kinect? I suppose we'll have to wait and see to what extent it's used within the system.

If they haven't included a DSP for games then I'd guess the reason would be because they have 8 CPU cores there. They could have included a DSP for 360 but they had a triple core CPU so decided to let that do everything.


That's true, but I suspect Jaguar is still much better core for core than Xenon, even at half the clock speed; Xenon had some pretty big inefficiencies (remember, that was during the Pentium 4 era). And from what I understood of the article, the sound accelerator in Durango would handle noise cancellation for Kinect as well as other audio encoding/decoding.

By the way, was it ever settled whether audio on the 360 was taking one physical core, or just one thread (of which each core has two)? And even with 8 cores, it seems like Microsoft wouldn't want to waste 1/8th of the CPU's power just doing audio when a piece of silicon that costs so little and draws so little power could do the same. SoCs even have them integrated on-chip, they're so trivially small now.
 
I wanted to come and post how awesome the new games looked.

 

pestul

Member
Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol

I don't think it's spectacular, but the open world is very promising. Also, the fact that it's probably an online multiplayer game with those visuals is quite impressive.
 
It's not anything too fantastic, but you can tell there is a fidelity increase over the PS3 and 360. I don't know, though. Xenoblade looked freaking awesome on Wii, and we know how weak that device is. Maybe it's partly art direction? Looks great anyway.
 

Van Owen

Banned
Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol

It looks like it could be done on 360.

Not to say it looks bad per say.
 

Donnie

Member
That's true, but I suspect Jaguar is still much better core for core than Xenon, even at half the clock speed; Xenon had some pretty big inefficiencies (remember, that was during the Pentium 4 era). And from what I understood of the article, the sound accelerator in Durango would handle noise cancellation for Kinect as well as other audio encoding/decoding.

By the way, was it ever settled whether audio on the 360 was taking one physical core, or just one thread (of which each core has two)? And even with 8 cores, it seems like Microsoft wouldn't want to waste 1/8th of the CPU's power just doing audio when a piece of silicon that costs so little and draws so little power could do the same. SoCs even have them integrated on-chip, they're so trivially small now.

Jaguar could very well be faster than Xenon in many ways, core for core, even at 1.6GHz. Though I don't think it's going to blow it away or anything.

It was definitely one core, though I think he said that some much less complex games got away with one thread.

I completely agree that a DSP sounds like the right way to go. Hopefully they have gone that way for their own sake, because IMO it's a waste not to.
 

ozfunghi

Member
It looks like it could be done on 360.

Not to say it looks bad per say.

Yeah, I'm specifically talking about the scope; I don't know if there is a 360 game that pushes the same scale with that level of detail (not saying there isn't, just that I don't know).

PS: I don't want to act like a douche, but you're the third person I've seen posting it this week. It's not "per say"; it comes from Latin, it's "per se" (in/by itself). It stands out especially since I'm not a native English speaker, yet we also use the expression "per se" in our language, so I guess it jumps out more when reading it spelled like that.

I wanted to come and post how awesome the new games looked.

Blu will have your blood.



Posting screens/gifs inevitably leads to other people posting pictures of other games on other hardware to compare, and a flame fest follows. Come to think of it, maybe he didn't like me mentioning the game at all either, lol.
 

tipoo

Banned
Yeah, I'm specifically talking about the scope; I don't know if there is a 360 game that pushes the same scale with that level of detail (not saying there isn't, just that I don't know).

I can't think of one at the moment, but GoW 3 on PS3 comes to mind. If you can really fly to any point on the horizon you see there, that is huge though.
 
Wasn't the Wii U's GPU originally 400MHz? Sorry to ruin the hypothesis a bit, but I do remember it being mentioned that the system originally had a 400MHz GPU and a 1GHz CPU.

Doesn't ruin it really, as the bandwidth of the Radeon card would be used as a target. We know the early dev kits had to be clocked below target because of overheating.
 
It looks like it could be done on 360.

Not to say it looks bad per say.

Well, unless you have a 360 game in mind that looks as nice at that scale (and really few games this gen have this sort of draw distance, except probably AC, RDR and Skyrim, and this looks better than those), I'm not sure it could.



Well, it's your word versus AN ACTUAL WII U DEVELOPER'S. I wonder whose carries more weight (well, really, any weight)?
 

guek

Banned
Here's the thing: even if the Wii U is a step above the 360/PS3, it's not far enough ahead to look like anything close to a "next-gen leap". So people like Van Owen will always be able to say "eh, doable on last gen" with nothing more than casual observation as their evidence.

Expect to hear it a lot in the coming years, regardless of how Wii U games look. You could have something like 1313 locked at 30fps/720p/no AA and people would say "meh, doable on 360", because hey, it wouldn't be far from the truth.
 
Where did they suggest it is?

Can't remember the thread title; there may be a link in this thread. But it's an interview done by Ideaman, particularly about the Wii U memory architecture, where the dev or devs reveal eDRAM bandwidth without being specific, "XXX GB/s" being the non-specific answer. This was a response to questions about the DDR3's low bandwidth.
 
Can't remember the thread title; there may be a link in this thread. But it's an interview done by Ideaman, particularly about the Wii U memory architecture, where the dev or devs reveal eDRAM bandwidth without being specific, "XXX GB/s" being the non-specific answer. This was a response to questions about the DDR3's low bandwidth.

I think you're confusing Ideaman's own speculations with what the developer actually said. I've read both the NES interview with Shin'en and the "Not Enough Bandwidth" articles by IM, and nowhere does the developer allude to XXX GB/s of bandwidth. That doesn't mean it's not true - we really don't know yet. But I suspect it's lower.
 
I think you're confusing Ideaman's own speculations with what the developer actually said. I've read both the NES interview with Shin'en and the "Not Enough Bandwidth" articles by IM, and nowhere does the developer allude to XXX GB/s of bandwidth. That doesn't mean it's not true - we really don't know yet. But I suspect it's lower.

Actually, it was in the Not Enough Bandwidth article, which brings up another good point not addressed here. I noticed in the comments that people were calling out the claim that the Wii U only has 1 RAM bus when both the GC and Wii had 2. Was this just speculation, or was this something seen in an actual "breakdown" (or whatever partial breakdown they've done)?
 
Actually, it was in the Not Enough Bandwidth article, which brings up another good point not addressed here. I noticed in the comments that people were calling out the claim that the Wii U only has 1 RAM bus when both the GC and Wii had 2. Was this just speculation, or was this something seen in an actual "breakdown" (or whatever partial breakdown they've done)?

From what I've read of those comments, it sounds like angry fanboys in denial. It's amazing what cognitive dissonance can do. The Gamecube had an additional bus to its ARAM, which was pretty much slow as molasses. The Wii had a bus on the Hollywood MCM to the 24 MB of 1T-SRAM and another bus to its GDDR3. The Wii U has no need for an additional external bus, as it contains one unified pool of DDR3. The 32 MB of eDRAM is on-chip. The internal bandwidth from the GPU to that pool is still a mystery, but I have suggested 70.4 GB/s.

Edit: I just read those comments again. Wow. The posters do not seem to grasp that each RAM chip carries a 16-bit interface. Anything more than a single 64-bit channel is impossible with that setup.
 

ikioi

Banned
You couldn't be bothered to correctly compute the BW of a well-known RAM & bus configuration, even though you got the basic multipliers right (hint: it's not 10.2GB/s) - how you managed that is beyond me.

I explained how. I'm so used to dealing with PC memory that I stupidly input 64 bits per module rather than 16. I'm just so used to dealing with DDR3 on a 64-bit bus.

You come up with the most absurd of ideas that U-GPU's ROPs would be bound to the slower DDR3 pool, and not to the eDRAM pool.

No, I didn't. I'm not even sure where this came from.

Care to quote a post of mine where I said the ROPs are tied to the DDR3 and not the eDRAM?

What I said was that unlike the Xbox 360, which had the ROPs and eDRAM integrated into one die with incredibly high bandwidth between the two, I believe the Wii U's eDRAM is on a slower bus, integrated directly within the GPU die, but with far higher overall bandwidth to the remainder of the GPU, like its shaders. The DDR3 in the Wii U is also quite slow. As such, there would be no justifiable reason for Nintendo and AMD to include more than 8 ROPs in the Wii U's GPU.

That is what I said. I have NFI where this 'ROPs tied to DDR3' bullshit comes from; I never said that. In fact, what I said about the DDR3 was that due to its low bandwidth there's no way it could be used to feed the ROPs. So I said the complete opposite of what you claim: I looked at the DDR3 and ruled it out as being usable for the ROPs.


Have you considered the possibility you might be in the wrong thread, as per your current state of mind?

Absolutely. The Wii U's eDRAM could be insanely faster than my sub-100 GB/s belief. Renesas do seem to have the tech to do faster, a lot faster in fact. But if that were the case, why have we seen no multi-platform games take advantage of the larger eDRAM pool to offer improved AA and AF in the Wii U versions? I've seen nothing to date that suggests the Wii U's eDRAM has data throughput to its ROPs similar to the Xbox 360's.

And the final point you ended up at has essentially nothing at all to do with the original point in your rant.

As per the above.

My argument was that the Wii U's GPU appears to have 8 ROPs. I based this on my belief that the Wii U's eDRAM pool isn't as fast as the Xbox 360's pool was to its ROPs. Overall, the Wii U's eDRAM pool is likely faster to the remainder of its GPU, but slower to the ROPs. The 64-bit DDR3 is also slow, which again hurts bandwidth and means adding extra ROPs would be pointless anyway. ROPs require bandwidth to work, lots of it. I don't believe the Wii U's eDRAM or DDR3 pools can provide enough bandwidth to justify more than 8 ROPs.
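As a back-of-the-envelope on that last point (a sketch using the clocks assumed in this thread; real traffic varies with blending, overdraw and compression):

```python
# Rough ROP bandwidth demand: each ROP writes up to one pixel per clock.
# 550 MHz is the Wii U GPU clock assumed in this thread; 8 bytes/pixel
# (4-byte colour + 4-byte depth/stencil) is a common worst-case figure.

def rop_demand_gbs(rops, clock_mhz, bytes_per_pixel=8):
    return rops * clock_mhz * 1e6 * bytes_per_pixel / 1e9

for rops in (8, 16):
    print(rops, "ROPs ->", rop_demand_gbs(rops, 550), "GB/s")
# 8 ROPs -> 35.2 GB/s; 16 ROPs -> 70.4 GB/s, i.e. 16 ROPs alone would
# saturate the 70.4 GB/s eDRAM figure hypothesized earlier in the thread
```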

An actual analysis shows that there are a lot of pros and cons to the 360 setup, not the "lolz WiiU slow eDRAM 360 fast eDRAM lolz" nonsense you are putting out.

Of course there are. I never said the Xbox 360 was some gem of engineering and efficiency.
 
Don't know if this is the right place to ask a question, but I'll do it anyway since it relates to the Wii U.

Can the Wii U play Wii, Gamecube, SNES, NES, and other Nintendo console games? Really stupid question, I know, but I haven't paid much attention to the Wii U since I lost interest in Nintendo with its release.

If the answer to my question is yes, then I have another question.

I'm somebody who has never been into Nintendo, Mario, Zelda, etc. Since all these new Mario games and the new Zelda HD are coming out, I've felt a need to try out these Nintendo classics. I'm trying to capture the entire experience of every Nintendo franchise, especially Zelda and Mario. Would I be able to play most, if not all, of the games for each franchise with just the Wii U, and are the Nintendo franchises good enough to spend $400-500 on if I've never played a SINGLE one of them?
 

SmokeMaxX

Member
Can the Wii U play Wii, Gamecube, SNES, NES, and other Nintendo console games? Really stupid question, I know, but I haven't paid much attention to the Wii U since I lost interest in Nintendo with its release.

If the answer to my question is yes, then I have another question.

I'm somebody who has never been into Nintendo, Mario, Zelda, etc. Since all these new Mario games and the new Zelda HD are coming out, I've felt a need to try out these Nintendo classics. I'm trying to capture the entire experience of every Nintendo franchise, especially Zelda and Mario. Would I be able to play most, if not all, of the games for each franchise with just the Wii U, and are the Nintendo franchises good enough to spend $400-500 on if I've never played a SINGLE one of them?

It can natively play Wii games, but can't natively play the others. The Virtual Console allows you to play SNES/NES games (as well as Game Boy and Game Boy Advance) and, I think, N64 games. I'm sure Gamecube games will come along as well.

Nintendo games should definitely be played, and I'm sure you can get a good experience of most of them on it. There are certain franchises that are "meh" on the console (Pokemon, for one; not sure if they've released it on VC, but I doubt it). There are some experiences that are a little different from the original (i.e. N64 games).
 
It would make no sense for Nintendo to do this, considering the bandwidth requirements of shaders. I think devs chose to do straight ports; these were small teams that spent more time on GamePad ideas. It would have been pointless to put in 32MB of eDRAM and keep the bus small.
 

ozfunghi

Member
Don't know if this is the right place to ask a question, but I'll do it anyway since it relates to the Wii U.

Can the Wii U play Wii, Gamecube, SNES, NES, and other Nintendo console games? Really stupid question, I know, but I haven't paid much attention to the Wii U since I lost interest in Nintendo with its release.

If the answer to my question is yes, then I have another question.

I'm somebody who has never been into Nintendo, Mario, Zelda, etc. Since all these new Mario games and the new Zelda HD are coming out, I've felt a need to try out these Nintendo classics. I'm trying to capture the entire experience of every Nintendo franchise, especially Zelda and Mario. Would I be able to play most, if not all, of the games for each franchise with just the Wii U, and are the Nintendo franchises good enough to spend $400-500 on if I've never played a SINGLE one of them?

Backwards compatible with Wii only; the others likely through Virtual Console (due this summer). Zelda: Wind Waker (my favorite) is currently being remade in HD (there is a thread for that). If you have never played those games, yes, they warrant the purchase. On the other hand, you don't know if they "are for you". Maybe you should just visit a friend with a Wii (U) and play some Mario Galaxy, check out Zelda, etc. before purchasing. But quality-wise, sure, the software is worth it. You just have to find out if you dig the gameplay and style.

Edit: wait, you're only 16 years old? Chances are you should probably start with the Metroid Prime trilogy and perhaps Zelda: Twilight Princess... unless you think you're man enough to withstand the amount of kiddieness that oozes from Wind Waker, which has the potential to turn you into either a 5-year-old or a girl. This game is only for real men.
 
It would make no sense for Nintendo to do this, considering the bandwidth requirements of shaders. I think devs chose to do straight ports; these were small teams that spent more time on GamePad ideas. It would have been pointless to put in 32MB of eDRAM and keep the bus small.

Almost as pointless as including 2 GB of RAM but then crippling it with a 64-bit bus?

Plainly, cost is the deciding factor. If integrating the eDRAM with a wider bus reduced yields or cost more in R&D, Nintendo likely passed. As it is, even the lowest configuration gives them "good enough" results and easily provides Wii BC.
 

ikioi

Banned
I doubt integrating the eDRAM into the GPU would have been cheaper than doubling the MEM2 bus.

eDRAM is quite expensive and affects yields. I can't see how this could have been a cheaper option for Nintendo vs either no eDRAM and a faster MEM2 bus, or a smaller pool of eDRAM off the GPU die.

Why do you think it was cheaper?
 