
SpoonyBard
Banned
(01-23-2013, 06:59 AM)

Originally Posted by OrbitingCow

Seriously: no gamepad, pack in a pro controller, and beef up the specs for this system by 2x, and I would have been there launch day. Sigh.

Honestly, that sounds boring as hell. There are other systems for people like you.
Margalis
Banned
(01-23-2013, 07:03 AM)

Originally Posted by ikioi

I'll accept that critisism. I really did fail to clarify wtf i was saying.

So yep i'll cop that.

Yeah no, sorry. The problem isn't that you didn't "clarify" what you were saying, the problem is that what you said was nonsense, then in an attempt to justify it "clarified" it by completely changing the meaning of what you said, including changing the meaning of specific technical jargon in order to save face and justify your console warrior ranting. And the final point you ended up at has essentially nothing at all to do with the original point in your rant.

The fact that 360 can do super fast ROP stuff is cool. Yay. Too bad resolving to main memory is itself a huge bottleneck in part because the same design that gives you "super high bandwidth mumble mumble" involves passing through two lower-bandwidth bottlenecks.

You say you find a WiiU developer who hasn't complained about this or that? Find a 360 developer who hasn't complained about resolving textures to main memory, tiling, etc.

You accuse people of being "fanbois" but looking at only the positives of one system and only the negatives of another sure looks like fanboyism to me.

An actual analysis shows that there are a lot of pros and cons to the 360 setup, not the "lolz WiiU slow eDRAM 360 fast eDRAM lolz" nonsense you are putting out.

Your posts are full of errors of all kinds, from basic repeated math errors to constant typos and improper capitalization. You use words one way then in your next post use them a different way - neither being correct. You are obviously hastily typing up rants then later trying to justify them while filling your posts with hyperbole and polemics. It's tiresome to read and you aren't convincing anyone of anything other than that you are for some reason very emotionally invested in this.

If you want to try your hand at honest analysis feel free. If you want to continue to produce silly nonsense wrapped in shoddy technical arguments featuring the constant failure to correctly multiply numbers together feel free not to bother. Please.

Learn the difference between analysis and advocacy.
Last edited by Margalis; 01-23-2013 at 07:14 AM.
blu
Wants the largest console games publisher to avoid Nintendo's platforms.
(01-23-2013, 08:04 AM)
blu's Avatar
As much as I hate doing this..

Originally Posted by ikioi

You mean like how i talked about the Wii U's MEM2 pool being on a 64bit bus. DDR3 1600 on a 64bit bus. 4 chips, 512 megabyte capcity, all on a 16 bit bus. 16bit x 4 = 64bit. 200mhz base clock x 4 x 16 = 10.2gbs per second of bandwidth. This is in comparison to the next gen consoles which appear to be using 256bit for their main memory pools. The Xbox 360 and PS3 also used 128bit bus for their GDDR3, which still provides more raw bandwidth then the Wii Us. Even with a modern memory controller there's no way the Wii U's ram is on par even in the real world vs the Xbox 360's memory.

Or the likelyhood that the GPU only has 8 ROPs due to the low memory bandwidth of the Wii U. ROPs are bandwidth dependant, there's no point adding more ROPs unless you can feed them data fast enough.

To which i expanded on by using the XBox 360 as an example. With the Xbox 360 the ROPs were intergrated into the eDRAM. Due to this configuration the ROPs could be fed data at around 256 gigabytes per second. The Wii U's eDRAM implamentation does not seem to be similar to this, with its bus being considerably slower.

Or the fact the CPU is the size of a single Intel ATOM core and has an incredibly low TDP. It's also based on the decade old IBM PPC 750 based architecture. Its performance is going to be anything but stellar.



There's no smoke without fire. In the case of the Wii U's cpu, the house is well and truely alight We've seen Dice slam it, Crytech slam it, unnamed sources months ago slam it, even developers publically comment on how it was an obstacle they had to work around.

There's also no denying the CPU is based on decade plus old IBM PPC 750 architecture, and has the transistor count of a single Intel atom core. It also has an incredibly low TDP.




Technical knowledge from a small time indy developer who has made one simple small game for the e-store. They also just so happen to only make games for Nintendo hardware.

Yeah totally indicative of the Wii U's performance and an unbais source.

Also you're just as full of the rhetric as anyone else here. You come into this thread criticising people for their arugments and views, yet offer none of your own.

Pal, no offense, but your posts (like the one above) have contributed nothing to this thread. On the contrary, they've brought the level of discussion down several notches.

You couldn't bother to correctly compute the BW of a well known RAM & bus configuration, even though you got the basic multipliers right (hint: it's not 10.2GB/s) - how you managed to do that is beyond me.

You come up with the most absurd of ideas that U-GPU's ROPs would be bound to the slower DDR3 pool, and not to the eDRAM pool.

You end your posts with 'Fuck you, console vendor X'.

Have you considered the possibility you might be in the wrong thread, as per your current state of mind?
Last edited by blu; 01-23-2013 at 08:07 AM.
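For what it's worth, the peak-bandwidth arithmetic blu is hinting at is straightforward. A sketch using the figures quoted above (peak theoretical numbers only; the helper name is purely illustrative):

```python
# Peak DRAM bandwidth = transfer rate (MT/s) * bus width (bits) / 8 bits per byte.
# Illustrative helper, not from any post in the thread.
def peak_bandwidth_gbs(megatransfers_per_s, bus_width_bits):
    return megatransfers_per_s * bus_width_bits / 8 / 1000  # GB/s

# DDR3-1600: 200 MHz base clock * 8 = 1600 MT/s; four 16-bit chips = 64-bit bus.
print(peak_bandwidth_gbs(1600, 16 * 4))  # 12.8, not 10.2
```

The same DDR3-1600 / 64-bit figure of 12.8 GB/s is given explicitly by ScepticMatt later in the thread.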
Kenka
Member
(01-23-2013, 08:14 AM)
Kenka's Avatar
If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and WiiU, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open about his job in the past and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before they purchase the WiiU. It's tiring to wait for announcements from Nintendo's side.
nordique
Member
(01-23-2013, 08:23 AM)
nordique's Avatar

Originally Posted by Margalis

The connection between the 360 eDRAM parent and daughter die is 32GB/s, the connection between parent and main memory is 22 GB/s - which you have to go through if you are going to do something useful with what you wrote to eDRAM.

The GPU cannot read from eDRAM, the read bandwidth there is a whopping zero.

Calling it "fast" and "high bandwidth" and the WiiU implementation you suggest slow and low-bandwidth is both an apples to oranges comparison and a purely semantic argument.

I'm pretty sure nobody has used RAM "slowness" to mean "throughput to ROPs" (you mean "from" ROPs?) until your elaborate backtrack just now.

Before this elaborate clarification you said:



Now you seem to be saying that you meant write-only bandwidth from ROPs instead, which I'm pretty sure cannot really be called a "bus" at all, certainly not more so than two other relevant actual busses.


Just to potentially add to this in general, regarding the "on-paper" 360 bandwidth that is brought up against the Wii U: it's important to consider that the 360's bandwidth seems to be limited from theoretical performance to 10.8GB/s (aggregated upstream/downstream bandwidth, or so the tech-speak results from Google searches tell me) in real-world situations. I don't know if any 360 developer here can confirm that, but I also recall reading, back during the heyday of HD console discussion, that the real-world performance of the two HD "twins" wasn't exactly what the spreadsheets said.

That is, many of the on-paper specs of the 360 and PS3 were really bloated compared to real-world performance.
nordique
Member
(01-23-2013, 08:29 AM)
nordique's Avatar

Originally Posted by Kenka

If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and WiiU, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open about his job in the past and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before they purchase the WiiU. It's tiring to wait for announcements from Nintendo's side.

It should be possible to port a game across all three, and all "next gen engines" should be fairly scalable. Whether a game goes to Wii U depends less on the tech side and more on the publisher's dime - I mean, Dead Rising was ported to the Wii because Capcom decided so.

If you're concerned about which games are coming before committing to a Wii U purchase, I would suggest you wait until the other two are revealed and decide for yourself.
Kenka
Member
(01-23-2013, 08:38 AM)
Kenka's Avatar

Originally Posted by nordique

It should be possible to port a game across all three, and all "next gen engines" should be fairly scalable. Whether a game goes to Wii U depends less on the tech side and more on the publisher's dime - I mean, Dead Rising was ported to the Wii because Capcom decided so.

If you're concerned about which games are coming before committing to a Wii U purchase, I would suggest you wait until the other two are revealed and decide for yourself.

Yes, this seems wiser. It's a bit of a bummer that the entire future library could hinge solely on good relationships between Nintendo and third parties, since Nintendo doesn't care much anyway.
Durante
Come on down to Durante's drivethru PC port fixes. 15 minutes or less. Yelp: ★★★★★

Fixed Souls, Deadly Premonition, Lightning Returns, Umihara Kawase, Symphonia, Little King's Story, PhD, likes mimosas.
(01-23-2013, 08:49 AM)
Durante's Avatar

Originally Posted by Kenka

If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and WiiU, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open about his job in the past and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before they purchase the WiiU. It's tiring to wait for announcements from Nintendo's side.

You want a simple answer to a complex issue.

Of course it's possible to create a game targeting all 3 platforms. However, it's also possible to create a game targeting everything from 3DS to PS4.

Of course it's also possible to create a game that targets PS4 and 720, and is extremely hard to port down to Wii U.

It's not a yes/no question, it's very much a sliding scale. The only thing anyone can do to provide a general answer is estimate the relative difficulty of multi-platform development.

And anyway, if your question is really "will these games come to Wii U", it may actually be at least as important to look at sales threads as it is to look at this thread.
Last edited by Durante; 01-23-2013 at 08:57 AM.
ozfunghi
Member
(01-23-2013, 09:14 AM)
ozfunghi's Avatar

Originally Posted by Kenka

If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and WiiU, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open about his job in the past and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before they purchase the WiiU. It's tiring to wait for announcements from Nintendo's side.

That answer is open to interpretation, I think. Without wanting to come across as presumptuous, because I'm not the guy with the info, I think it will almost always depend on the game, the developer, and the publisher, more than on the actual hardware being a clear-cut case of getting these ports or not.

Since the hardware is weaker, games that push the other systems, or rely heavily on certain strengths of that hardware, obviously won't port down easily. Should Wii have gotten a Dead Rising port? Should Xbox have gotten a Half-Life 2 port? Should PS360 have gotten Far Cry 3? Obviously anything "can" be ported; the question is whether it should be ported under any and all conditions. There will be games that "should" not be ported to WiiU, for sure. But I think many games "could" be ported as well, meaning there should be no real obstacle other than somewhat downsized graphics. But is the developer up for it? I have a feeling that if you asked the guys at Arkam's or Lherre's studio, they'd be rather sceptical and/or dismissive, while if you asked the guys from Shin'en, I bet they'd say it's possible in many cases, given the proper resources. And yet, if publishers aren't even considering WiiU for CURRENT-generation games regardless of hardware constraints, how much hope should we have for the future?
tipoo
Banned
(01-23-2013, 12:56 PM)

Originally Posted by Donnie

Well, we have the full hardware diagram for XBox 3, which describes some pretty small details, yet there's no mention of a DSP. Also, sound may be described as trivial for modern PC CPUs such as Phenom II and the Intel i series, but Jaguar isn't in that league. It wouldn't surprise me if audio could take up as much as a whole Jaguar core, or close enough.

XBox 3's Jaguar CPU is clocked 29% faster, but it's got a much longer pipeline - 14 stages compared to Espresso's 4. I'm not saying that gives WiiU's CPU the advantage core for core, or even makes it 100% as fast (we really can't compare exactly), but it should help to counteract the lower clock speed.

One whole Jaguar core for sound doesn't strike me as right. Jaguar isn't as powerful as Bulldozer or Ivy Bridge, but if audio takes a few percentage points of one of those cores, I can't imagine it taking 100% of a Jaguar core. It's slower, but not by that much.

And if it did, why wouldn't they include a DSP which is so tiny and cheap to produce?

EDIT: According to this, the nextbox does have audio acceleration

http://www.eurogamer.net/articles/df...box-specs-leak
Last edited by tipoo; 01-23-2013 at 01:54 PM.
ScepticMatt
Member
(01-23-2013, 02:08 PM)
ScepticMatt's Avatar
WiiU DDR3 bandwidth is 12.8 GB/s (assuming a 1600 MT/s data rate x 64-bit interface / 8 bits per byte)

On the topic of VLIW (Wii U) vs non-VLIW-SIMD (Durango/Orbis) Architecture differences

Originally Posted by Anandtech

AMD Graphics Core Next: Out With VLIW, In With SIMD

The fundamental issue moving forward is that VLIW designs are great for graphics; they are not so great for computing. [...]

The principal issue is that VLIW is hard to schedule ahead of time and there’s no dynamic scheduling during execution, and as a result the bulk of its weaknesses follow from that. As VLIW5 was a good fit for graphics, it was rather easy to efficiently compile and schedule shaders under those circumstances. With compute this isn’t always the case; there’s simply a wider range of things going on and it’s difficult to figure out what instructions will play nicely with each other. Only a handful of tasks such as brute force hashing thrive under this architecture.

Furthermore, VLIW lives and dies by the compiler, which means not only must the compiler be good, but every compiler must be good. This is an issue when it comes to expanding language support, as even with abstraction through intermediate languages you can still run into issues, including a compiler producing intermediate code that the shader compiler can't handle well.

In short, with VLIW, scheduling (deciding which instructions are to be executed when) needs to be done by the compiler, while non-VLIW designs allow dynamic scheduling at runtime, so they can be used more efficiently, especially for compute applications.
Last edited by ScepticMatt; 01-23-2013 at 02:11 PM.
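ScepticMatt's compiler-scheduling point can be illustrated with a toy model (purely illustrative Python, not any real ISA: a VLIW5-style machine issues fixed 5-slot bundles packed at compile time, and slots the compiler can't fill with independent instructions are wasted):

```python
def pack_vliw(instructions, deps, width=5):
    """Greedy compile-time bundling: an instruction joins the current bundle
    only if none of its dependencies are already in that same bundle."""
    bundles, current = [], []
    for instr in instructions:
        if len(current) == width or any(d in current for d in deps.get(instr, [])):
            bundles.append(current)
            current = []
        current.append(instr)
    if current:
        bundles.append(current)
    return bundles

# A dependent chain (common in compute kernels) packs terribly:
print(pack_vliw(["a", "b", "c", "d"], {"b": ["a"], "c": ["b"], "d": ["c"]}))
# -> [['a'], ['b'], ['c'], ['d']]  (4 bundles, only 1 of 5 slots used in each)

# Independent, graphics-like work fills a bundle completely:
print(pack_vliw(["a", "b", "c", "d", "e"], {}))
# -> [['a', 'b', 'c', 'd', 'e']]  (1 full bundle)
```

A dynamic (non-VLIW) scheduler makes this packing decision at runtime instead, which is the reason GCN-style SIMD handles unpredictable compute workloads more efficiently.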
TheGuardian
Member
(01-23-2013, 02:14 PM)
TheGuardian's Avatar
Thought I could contribute something to this thread that isn't about eDRAM or GPGPUs :P

One specific area where the WiiU can potentially (game dependent of course) gain a bit of performance compared to the 360 and PS3 is skinning/animation.

Some rough background info for those not familiar with it. Animation is in 90% of cases done using joints/bones. These consist of transformation (i.e. rotation) information set within a hierarchical structure called a skeleton. Being hierarchical means that if you move the parent, all the children move along with it. Rotate the elbow and the arm, hand, and fingers all go together.
Animators manipulate this skeleton into various animation frames that are interpolated and played back at runtime.

Skinning is the process where you take the joint data and the 3D model, and transform each vertex (a 3D point; vertices linked together define polygons) in the model by the joints that affect it. Vertices in the hand are affected by the hand joints, etc.

This is a relatively expensive process, more expensive the more joints influence a vertex. Many games limit the number of influences to 4.

Now, on to the platform specific part :)

A generic way to do the skinning is to use the vertex shader. A joint palette is sent to the shader, together with the influence weights and the rest of the normal shader data. The GPU then crunches everything together to produce the final vertex position.
The problem with this method comes with the fact that you usually need to render the same model more than once. For example, for rendering into the shadow map, rendering into a GBuffer, etc. That means the GPU has to do the skinning more than once, which is a bit wasteful.

On the PS3, for example, you have a different option. You can use the SPUs to do the skinning very efficiently, store the resulting data, and pass that to the GPU. While this is great for the GPU, especially if you are vertex bound, it requires extra RAM to store the data. Depending on the game, you might not be able to spare that memory.

On the WiiU, you can use the GPU's ability to stream out data to export the skinning data in the same way as the SPUs on the PS3, and reuse it later. The advantage on the WiiU is that you have a lot more memory, so storing this data should be easy enough for most cross platform titles.

Sorry for the mega post, hope this is interesting to someone.
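The vertex-shader skinning step described above boils down to a weighted sum of joint transforms. A minimal linear-blend sketch (2D matrices for brevity; real engines use 4x4 or 3x4 matrices, and the function names here are just illustrative):

```python
import math

def rot(theta):
    # 2D rotation matrix standing in for a joint's transform.
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def apply(m, v):
    # Matrix * vector.
    return [m[0][0] * v[0] + m[0][1] * v[1], m[1][0] * v[0] + m[1][1] * v[1]]

def skin_vertex(vertex, joint_matrices, influences):
    """influences: list of (joint_index, weight); weights sum to 1.
    The result is the weighted sum of the vertex transformed by each joint -
    many games cap len(influences) at 4, as noted above."""
    out = [0.0, 0.0]
    for joint, weight in influences:
        p = apply(joint_matrices[joint], vertex)
        out[0] += weight * p[0]
        out[1] += weight * p[1]
    return out

joints = [rot(0.0), rot(math.pi / 2)]  # joint 0 fixed, joint 1 rotated 90 degrees
# A vertex influenced equally by both joints blends the two transforms:
print(skin_vertex([1.0, 0.0], joints, [(0, 0.5), (1, 0.5)]))  # [0.5, 0.5]
```

The cost grows with the number of influences per vertex, which is why the 4-influence cap mentioned above is so common; the stream-out trick on Wii U simply saves repeating this sum for every render pass.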
Donnie
Member
(01-23-2013, 02:25 PM)

Originally Posted by tipoo

One whole Jaguar core for sound doesn't strike me as right. Jaguar isn't as powerful as Bulldozer or Ivy Bridge, but if audio takes a few percentage points of one of those cores, I can't imagine it taking 100% of a Jaguar core. It's slower, but not by that much.

And if it did, why wouldn't they include a DSP which is so tiny and cheap to produce?

EDIT: According to this, the nextbox does have audio acceleration

http://www.eurogamer.net/articles/df...box-specs-leak

I don't think these cores really compare to Bulldozer or Ivy Bridge, to be honest. One Bulldozer core is about the size of 5 Jaguar cores (well, the same size as 7, but there's a process difference to take into account, so 5 is more accurate). As far as audio processing goes, it can be surprising how hard it can hit a CPU. I remember it being argued that there was no way audio would use even a significant portion of a 3.2GHz Xenon core, in reference to talk of WiiU's DSP - until an MS engineer confirmed it often takes one entire core to process audio. Now I'm not saying audio processing would take an entire Jaguar core, just that it wouldn't surprise me if it came close to that; these aren't powerful cores after all.

You're right, that article does mention audio codecs, though I was under the impression that was for Kinect? I suppose we'll have to wait and see to what extent it's used within the system.

If they haven't included a DSP for games then I'd guess the reason would be because they have 8 CPU cores there. They could have included a DSP for 360 but they had a triple core CPU so decided to let that do everything.
Last edited by Donnie; 01-23-2013 at 03:10 PM.
Donnie
Member
(01-23-2013, 02:32 PM)

Originally Posted by Fourth Storm

I'm gonna throw a number out there: 70.4 GB/s. This, I believe, is the bandwidth from the Wii U GPU to its eDRAM.

Earlier in the thread, UX8GD eDRAM from NEC/Renesas was brought up as a leading (almost surefire) candidate for the Wii U GPU's on-chip memory. It comes in several configurations, but one now strikes me as most likely: 4 x 8MB macros, each with a 256-bit bus. What makes this interesting is that very early on there was talk of the first Wii U dev kits containing an underclocked RV770LE. This news got our hopes up, but 640 shaders and the like are now out of the question and, indeed, never seemed to be in the cards. Why wouldn't they have just used a weaker, smaller, and cooler PC part then (I'm assuming the report was true)? Well, the bandwidth of that card to its onboard GDDR3 just happens to be 57.6 GB/s. What's the bandwidth of eDRAM on a 1024-bit bus at 450 MHz (the reported clock of Wii U dev kits up until late 2011)? 57.6 GB/s. I know it's been hypothesized before, but it seems increasingly likely those first dev kits utilized that particular Radeon to simulate the bandwidth to Wii U's MEM1. Since they upped the speed to 550 MHz, it should now clock in at 70.4 GB/s, for better or worse.

Wasn't WiiU's GPU originally 400MHz? Sorry to ruin the hypothesis a bit, but I do remember it being mentioned that the system originally had a 400MHz GPU and a 1GHz CPU.
Last edited by Donnie; 01-23-2013 at 02:38 PM.
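Fourth Storm's figures do check out arithmetically. A quick reproduction (peak numbers only; the helper name is illustrative):

```python
# Peak eDRAM bandwidth for a 1024-bit bus (4 x 256-bit macros),
# assuming one transfer per clock: bytes/clock * clocks/s.
def edram_bandwidth_gbs(bus_width_bits, clock_mhz):
    return bus_width_bits / 8 * clock_mhz / 1000  # GB/s

print(edram_bandwidth_gbs(1024, 450))  # 57.6 - early dev kits, matching RV770LE's GDDR3
print(edram_bandwidth_gbs(1024, 550))  # 70.4 - at the reported final 550 MHz clock
```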
ozfunghi
Member
(01-23-2013, 02:52 PM)
ozfunghi's Avatar
Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol
tipoo
Banned
(01-23-2013, 03:07 PM)

Originally Posted by ozfunghi

Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol

You mean this?

http://www.youtube.com/watch?feature...&v=K40orKpbJBU

It does look pretty good, but still nothing that would be mind blowing on the HD 7th gen consoles imo. The textures and geometry are obviously far more than what the Wii could do, but that's about all I could say.
tipoo
Banned
(01-23-2013, 03:15 PM)

Originally Posted by Donnie

I don't think these cores really compare to Bulldozer or Ivy Bridge, to be honest. One Bulldozer core is about the size of 5 Jaguar cores (well, the same size as 7, but there's a process difference to take into account, so 5 is more accurate). As far as audio processing goes, it can be surprising how hard it can hit a CPU. I remember it being argued that there was no way audio would use even a significant portion of a 3.2GHz Xenon core, in reference to talk of WiiU's DSP - until an MS engineer confirmed it often takes one entire core to process audio. Now I'm not saying audio processing would take an entire Jaguar core, just that it wouldn't surprise me if it came close to that; these aren't powerful cores after all.

You're right, that article does mention audio codecs, though I was under the impression that was for Kinect? I suppose we'll have to wait and see to what extent it's used within the system.

If they haven't included a DSP for games then I'd guess the reason would be because they have 8 CPU cores there. They could have included a DSP for 360 but they had a triple core CPU so decided to let that do everything.


That's true, but I suspect Jaguar is still a much better core, core for core, than Xenon even at half the clock speed; Xenon had some pretty big inefficiencies (remember, that was during the Pentium 4 era). And from what I understood of the article, the sound accelerator in Durango would handle noise cancellation for Kinect as well as other audio encoding/decoding.

By the way, was it ever settled whether audio on 360 was taking one physical core, or just one thread (of which each core has two)? And even with 8 cores, it seems like Microsoft wouldn't want to waste 1/8th of its power just doing audio when a piece of silicon that costs so little and draws so little power could do the same. SoCs even have them integrated on-chip, they're so trivially small now.
phosphor112
Banned
(01-23-2013, 03:17 PM)
phosphor112's Avatar
I wanted to come and post how awesome the new games looked.



pestul
Member
(01-23-2013, 03:18 PM)
pestul's Avatar

Originally Posted by ozfunghi

Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol

I don't think it's spectacular, but the open-world is very promising. Also the fact that it's probably an online multiplayer game with those visuals is quite impressive.
phosphor112
Banned
(01-23-2013, 03:23 PM)
phosphor112's Avatar
It's not anything too fantastic, but you can tell there is a fidelity increase over PS3 and 360. I don't know though. Xenoblade looked freaking awesome on Wii, and we know how weak that device is. Maybe it's partly art direction? Looks great anyway.
Van Owen
Banned
(01-23-2013, 03:23 PM)
Van Owen's Avatar

Originally Posted by ozfunghi

Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol

It looks like it could be done on 360.

Not to say it looks bad per say.
FLAguy954
Junior Member
(01-23-2013, 03:24 PM)
FLAguy954's Avatar

Originally Posted by phosphor112

I wanted to come and post how awesome the new games looked.



That shore on the second gif needs some work. Something positive I noticed about the second gif, however, is how the distant areas are clear and easily viewable, a good sign I say.
lherre
Accurate
(01-23-2013, 03:25 PM)
lherre's Avatar
Sound in MS and Sony's next consoles won't be managed by the CPUs.
Donnie
Member
(01-23-2013, 03:25 PM)

Originally Posted by tipoo

That's true, but I suspect Jaguar is still a much better core, core for core, than Xenon even at half the clock speed; Xenon had some pretty big inefficiencies (remember, that was during the Pentium 4 era). And from what I understood of the article, the sound accelerator in Durango would handle noise cancellation for Kinect as well as other audio encoding/decoding.

By the way, was it ever settled whether audio on 360 was taking one physical core, or just one thread (of which each core has two)? And even with 8 cores, it seems like Microsoft wouldn't want to waste 1/8th of its power just doing audio when a piece of silicon that costs so little and draws so little power could do the same. SoCs even have them integrated on-chip, they're so trivially small now.

Jaguar could very well be faster in many ways than Xenon core for core, even at 1.6GHz, though I don't think it's going to blow it away or anything.

It was definitely one core, though I think he said that some much less complex games got away with one thread.

I completely agree that a DSP sounds like the right way to go; hopefully they have gone that way, for their own sake, because IMO it's a waste not to.
Donnie
Member
(01-23-2013, 03:31 PM)

Originally Posted by lherre

Sound in MS and Sony's next consoles won't be managed by the CPUs.

So they did learn a lesson. Any info on how the OS affects the CPUs in these consoles? As in the claim that 2 cores are set aside for OS and apps in XBox 3?
ozfunghi
Member
(01-23-2013, 03:41 PM)
ozfunghi's Avatar

Originally Posted by Van Owen

It looks like it could be done on 360.

Not to say it looks bad per say.

Yeah, I'm specifically talking about the scope; I don't know if there is a 360 game that pushes the same scale with that level of detail (not saying there isn't, just that I don't know).

PS: don't want to act like a douche, but you're the 3rd person I've seen posting it this week. It's not "per say"; it comes from Latin, it's "per se" (in/by itself). It comes off especially strange since I'm not a native English speaker, yet we also use the expression "per se" in our language, so I guess it stands out more when reading it spelled like that.

Originally Posted by phosphor112

I wanted to come and post how awesome the new games looked.

Blu will have your blood.


Originally Posted by phosphor112

Why?

Posting screens/gifs inevitably leads to other people posting pictures of other games on other hardware to compare, and a flame fest follows. Come to think of it, maybe he didn't like me mentioning the game at all either, lol.
Last edited by ozfunghi; 01-23-2013 at 04:03 PM.
phosphor112
Banned
(01-23-2013, 03:43 PM)
phosphor112's Avatar

Originally Posted by ozfunghi

Blu will have your blood.

Why?
tipoo
Banned
(01-23-2013, 04:03 PM)

Originally Posted by ozfunghi

Yeah, I'm specifically talking about the scope; I don't know if there is a 360 game that pushes the same scale with that level of detail (not saying there isn't, just that I don't know).

I can't think of one at the moment, but GoW 3 on PS3 comes to mind. If you can really fly to any point on the horizon you see there, though, that is huge.
phosphor112
Banned
(01-23-2013, 04:07 PM)
phosphor112's Avatar

Originally Posted by tipoo

I can't think of one at the moment, but GoW 3 on PS3 comes to mind. If you can really fly to any point on the horizon you see there, though, that is huge.

Just Cause 2? There is a lot more in terms of models and things on screen though.

https://www.youtube.com/watch?v=k_C6TjwxbGI
Fourth Storm
Member
(01-23-2013, 04:47 PM)
Fourth Storm's Avatar

Originally Posted by Donnie

Wasn't WiiU's GPU originally 400MHz? Sorry to ruin the hypothesis a bit, but I do remember it being mentioned that the system originally had a 400MHz GPU and a 1GHz CPU.

Doesn't ruin it really, as the bandwidth of the Radeon card would be used as a target. We know the early dev kits had to be clocked below target because of overheating.
OG_Original Gamer
Member
(01-23-2013, 05:09 PM)
So, I guess we can confirm the eDRAM bandwidth is XXXGB/s.
Fourth Storm
Member
(01-23-2013, 05:36 PM)
Fourth Storm's Avatar

Originally Posted by OG_Original Gamer

So, I guess we can confirm the eDRAM bandwidth is XXXGB/s.

No.
OG_Original Gamer
Member
(01-23-2013, 07:11 PM)

Originally Posted by Fourth Storm

No.

Well, if Shin'en says it is, or suggests it is, then it's enough for me; it's them having dev kits that leads me to believe them.
NBtoaster
Member
(01-23-2013, 07:22 PM)
NBtoaster's Avatar

Originally Posted by phosphor112

I wanted to come and post how awesome the new games looked.



Background detail looks much better than something like Skyrim. Putting that RAM to use.
OG_Original Gamer
Member
(01-23-2013, 07:28 PM)
That's putting that eDRAM to good use.
NBtoaster
Member
(01-23-2013, 07:42 PM)
NBtoaster's Avatar

Originally Posted by OG_Original Gamer

That's putting that eDRAM to good use.

I doubt much geometric detail would be in eDRAM; that'll be for the framebuffer, shadows, maybe a few textures.

Though texture detail in the distance seems good too.
Last edited by NBtoaster; 01-23-2013 at 07:50 PM.
OG_Original Gamer
Member
(01-23-2013, 07:51 PM)

Originally Posted by NBtoaster

I doubt much geometric detail would be in eDRAM, that'll be for the framebuffer, shadows, maybe a few textures..

Though texture detail in the distance seems good too.

Don't forget shaders.
Zoramon089
Banned
(01-23-2013, 07:59 PM)
Zoramon089's Avatar

Originally Posted by Van Owen

It looks like it could be done on 360.

Not to say it looks bad per say.

Well, unless you have a 360 game in mind that looks as nice at that scale (and really few games this gen have this sort of draw distance - probably just AC, RDR and Skyrim, and this looks better than those), I'm not sure it could.


Originally Posted by Fourth Storm

No.

Well, it's your word versus AN ACTUAL WII U DEVELOPER. I wonder which has more weight (well, really, any weight)?
Fourth Storm
Member
(01-23-2013, 08:00 PM)

Originally Posted by OG_Original Gamer

Well, if Shin'en says it is, or suggests it is, then it's enough for me. It's them having dev kits that leads me to believe them.

Where did they suggest it is?
guek
Banned
(01-23-2013, 08:09 PM)
Here's the thing: even if the Wii U is a step above the 360/PS3, it's not far enough ahead to look like anything close to a "next gen leap". So people like Van Owen will always be able to say "eh, doable on last gen" with nothing other than casual observation as their evidence.

Expect to hear it a lot in the coming years, regardless of how Wii U games look. You could have something like 1313 locked at 30fps/720p/no AA and people would say "meh, doable on 360" because hey, it wouldn't be far from the truth.
OG_Original Gamer
Member
(01-23-2013, 09:18 PM)

Originally Posted by Fourth Storm

Where did they suggest it is?

Can't remember the thread title; there may be a link in this thread. But it's an interview done by Ideaman, specifically about the Wii U memory architecture, in which the dev or devs mention the eDRAM's bandwidth without being specific. XXXGB/s was the non-specific answer. This was in response to questions about the DDR3's low bandwidth.
Fourth Storm
Member
(01-23-2013, 10:06 PM)

Originally Posted by OG_Original Gamer

Can't remember the thread title; there may be a link in this thread. But it's an interview done by Ideaman, specifically about the Wii U memory architecture, in which the dev or devs mention the eDRAM's bandwidth without being specific. XXXGB/s was the non-specific answer. This was in response to questions about the DDR3's low bandwidth.

I think you're confusing Ideaman's own speculations with what the developer actually said. I've read both the NES interview with Shin'en and the "Not Enough Bandwidth" articles by IM and nowhere does the developer allude to XXXGB/s bandwidth. That doesn't mean it's not true - we really don't know yet. But I suspect it's lower.
Zoramon089
Banned
(01-23-2013, 10:24 PM)

Originally Posted by Fourth Storm

I think you're confusing Ideaman's own speculations with what the developer actually said. I've read both the NES interview with Shin'en and the "Not Enough Bandwidth" articles by IM and nowhere does the developer allude to XXXGB/s bandwidth. That doesn't mean it's not true - we really don't know yet. But I suspect it's lower.

Actually, it was in the Not Enough Bandwidth article, which brings up another good point not addressed here. I noticed in the comments people were calling out the claim that the Wii U only has one RAM bus when both the GC and Wii had two. Was this just speculation, or was it something seen in an actual "breakdown" (or whatever partial breakdown they've done)?
Fourth Storm
Member
(01-23-2013, 10:37 PM)

Originally Posted by Zoramon089

Actually, it was in the Not Enough Bandwidth article, which brings up another good point not addressed here. I noticed in the comments people were calling out the claim that the Wii U only has one RAM bus when both the GC and Wii had two. Was this just speculation, or was it something seen in an actual "breakdown" (or whatever partial breakdown they've done)?

From what I've read of those comments, it sounds like angry fanboys in denial. It's amazing what cognitive dissonance can do. The GameCube had an additional bus to its ARAM, which was pretty much slow as molasses. The Wii had a bus on the Hollywood MCM to the 24 MB of 1T-SRAM and another bus to its GDDR3. The Wii U has no need for an additional external bus, as it contains one unified pool of DDR3; the 32 MB of eDRAM is on-chip. The internal bandwidth from the GPU to that pool is still a mystery, but I have suggested 70.4 GB/s.

Edit: I just read those comments again. Wow. The posters do not seem to grasp that each RAM chip carries a 16-bit interface. Anything more than a single 64-bit channel is impossible with that setup.
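For what it's worth, here is one way the 70.4 GB/s figure could arise. This is only a sketch: both the 1024-bit internal bus width and the 550 MHz clock are assumptions from the discussion, not confirmed specs.

```python
# Sketch: a possible origin of the 70.4 GB/s eDRAM estimate.
# Assumptions (not confirmed): 1024-bit internal bus, 550 MHz GPU clock.
bus_bits = 1024
clock_hz = 550e6
bandwidth_gb_s = bus_bits / 8 * clock_hz / 1e9  # bytes/clock * clocks/s
print(bandwidth_gb_s)  # 70.4
```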
Last edited by Fourth Storm; 01-23-2013 at 10:43 PM.
ikioi
Banned
(01-23-2013, 11:07 PM)

Originally Posted by blu

You couldn't bother to correctly compute the BW of a well known RAM & bus configuration, even though you got the basic multipliers right (hint: it's not 10.2GB/s) - how you managed to do that is beyond me.

I explained how. I'm so used to dealing with PC memory, where DDR3 sits on a 64-bit bus, that I stupidly put in 64 bits per module rather than 16.
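The slip can be made concrete with a quick sketch. The four-module count and the DDR3-1600 speed grade are assumptions for illustration; the point is the per-module bus width.

```python
# Sketch of the slip described above: peak DDR3 bandwidth for four
# modules at DDR3-1600 (1600 MT/s), varying the per-module bus width.
modules = 4
transfers_per_s = 1600e6  # DDR3-1600 transfer rate

def peak_gb_s(bits_per_module):
    """Peak bandwidth in GB/s for the given per-module bus width."""
    return modules * bits_per_module / 8 * transfers_per_s / 1e9

print(peak_gb_s(64))  # 51.2 -- the PC-style (64-bit DIMM) assumption
print(peak_gb_s(16))  # 12.8 -- 16-bit chips, one 64-bit channel
```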

You come up with the most absurd of ideas that U-GPU's ROPs would be bound to the slower DDR3 pool, and not to the eDRAM pool.

No, I didn't. I'm not even sure where this came from.

Care to quote a post of mine where i said the ROPs are tied to the DDR3 and not the eDRAM?

What I said was: unlike the Xbox 360, which had the ROPs and eDRAM integrated into one die with incredibly high bandwidth between the two, I believe the Wii U's eDRAM is on a slower bus, integrated directly within the GPU die, but with far higher overall bandwidth to the rest of the GPU, like its shaders. The DDR3 in the Wii U is also quite slow. As such, there would be no justifiable reason for Nintendo and AMD to include more than 8 ROPs in the Wii U's GPU.

That is what I said. I have no idea where this "ROPs tied to DDR3" claim comes from. I never said that. In fact, what I said about the DDR3 was that due to its low bandwidth there's no way it could be used to feed the ROPs. So I said the complete opposite of what you claim: I looked at the DDR3 and ruled it out as usable for the ROPs.


Originally Posted by blu

Have you considered the possibility you might be in the wrong thread, as per your current state of mind?

Absolutely. The Wii U's eDRAM could be insanely faster than my sub-100 GB/s belief. Renesas do seem to have the tech to go faster, a lot faster in fact. But if that were the case, why have we seen no multiplatform games take advantage of the larger eDRAM pool to offer improved AA and AF in their Wii U versions? I've seen nothing to date that suggests the Wii U's eDRAM has data throughput to its ROPs similar to the Xbox 360's.

Originally Posted by Margalis

And the final point you ended up at has essentially nothing at all to do with the original point in your rant.

As per the above.

My argument was that the Wii U's GPU appears to have 8 ROPs. I based this on my belief that the Wii U's eDRAM pool isn't as fast as the Xbox 360's pool was to its ROPs. Overall, the Wii U's eDRAM pool is likely faster to the rest of its GPU, but slower to the ROPs. The 64-bit DDR3 is also slow, which again hurts bandwidth and means adding extra ROPs would be pointless anyway. ROPs require bandwidth to work, lots of it. I don't believe the Wii U's eDRAM or DDR3 pools can provide enough bandwidth to justify more than 8 ROPs.
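A rough sketch of that bandwidth appetite. Every input here is an assumption for illustration: 8 ROPs, a 550 MHz clock, one pixel per ROP per clock, uncompressed 32-bit color plus 32 bits of depth traffic per pixel, and no compression or overdraw savings.

```python
# Sketch: rough bandwidth 8 ROPs could demand at full fill rate.
# All inputs are illustrative assumptions, not confirmed specs.
rops = 8
clock_hz = 550e6          # assumed GPU clock
bytes_per_pixel = 4 + 4   # RGBA8 color write + Z traffic
demand_gb_s = rops * clock_hz * bytes_per_pixel / 1e9
print(demand_gb_s)  # 35.2
```

Even under these generous simplifications, the demand lands well above a 12.8 GB/s DDR3 pool, which is why the eDRAM's speed to the ROPs is the crux of the argument.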

An actual analysis shows that there are a lot of pros and cons to the 360 setup, not the "lolz WiiU slow eDRAM 360 fast eDRAM lolz" nonsense you are putting out.

Of course there are; I never said the Xbox 360 was some gem of engineering and efficiency.
Last edited by ikioi; 01-23-2013 at 11:27 PM.
CoolMarquis97
Banned
(01-23-2013, 11:36 PM)
Don't know if this is the right place to ask a question, but I'll do it anyway since it relates to the Wii U.

Can the Wii U play Wii, GameCube, SNES, NES, and other Nintendo console games? Really stupid question, I know, but I haven't paid much attention to the Wii U since I lost interest in Nintendo with its release.

If the answer to my question is yes, then I have another question.

I'm somebody who has never been into Nintendo, Mario, Zelda, etc. Since all these new Mario games and the new Zelda HD are coming out, I've felt a need to try out these Nintendo classics. I'm trying to capture the entire experience of every Nintendo franchise, especially Zelda and Mario. Would I be able to play most if not all of the games for each franchise with just the Wii U, and are the Nintendo franchises good enough to spend $400-500 on if I've never played a SINGLE one of them?
SmokeMaxX
Member
(01-23-2013, 11:45 PM)

Originally Posted by CoolMarquis97

Can the Wii U play Wii, GameCube, SNES, NES, and other Nintendo console games? Really stupid question, I know, but I haven't paid much attention to the Wii U since I lost interest in Nintendo with its release.

If the answer to my question is yes, then I have another question.

I'm somebody who has never been into Nintendo, Mario, Zelda, etc. Since all these new Mario games and the new Zelda HD are coming out, I've felt a need to try out these Nintendo classics. I'm trying to capture the entire experience of every Nintendo franchise, especially Zelda and Mario. Would I be able to play most if not all of the games for each franchise with just the Wii U, and are the Nintendo franchises good enough to spend $400-500 on if I've never played a SINGLE one of them?

It can natively play Wii games, but not the others. The Virtual Console allows you to play SNES/NES games (as well as Game Boy and Game Boy Advance) and, I think, N64 games. I'm sure GameCube games will come along as well.

Nintendo games should definitely be played, and I'm sure you can get a good experience of most of them on it. There are certain franchises that are "meh" on the console (Pokemon, for one). Not sure if they've released it on VC, but I doubt it. Some experiences are a little different from the original (i.e. N64 games).
OG_Original Gamer
Member
(01-23-2013, 11:48 PM)
It wouldn't make any sense for Nintendo to do this considering the bandwidth requirements of shaders. I think devs chose to do straight ports; these were small teams that spent more time on GamePad ideas. Otherwise it would have been pointless to put in 32 MB of eDRAM and keep the bus small.
ozfunghi
Member
(01-24-2013, 12:06 AM)

Originally Posted by CoolMarquis97

Don't know if this is the right place to ask a question, but I'll do it anyway since it relates to the Wii U.

Can the Wii U play Wii, GameCube, SNES, NES, and other Nintendo console games? Really stupid question, I know, but I haven't paid much attention to the Wii U since I lost interest in Nintendo with its release.

If the answer to my question is yes, then I have another question.

I'm somebody who has never been into Nintendo, Mario, Zelda, etc. Since all these new Mario games and the new Zelda HD are coming out, I've felt a need to try out these Nintendo classics. I'm trying to capture the entire experience of every Nintendo franchise, especially Zelda and Mario. Would I be able to play most if not all of the games for each franchise with just the Wii U, and are the Nintendo franchises good enough to spend $400-500 on if I've never played a SINGLE one of them?

Backwards compatible with Wii only. The others will likely come through the Virtual Console (due this summer). Zelda: Wind Waker (my favorite) is currently being remade in HD (there is a thread for that). If you have never played those games, then yes, they warrant the purchase. On the other hand, you don't know yet if they "are for you". Maybe you should just visit a friend with a Wii (U) and play some Mario Galaxy, check out Zelda, etc. before purchasing. But quality-wise, sure, the software is worth it. You just have to find out if you dig the gameplay and style.

Edit: wait, you're only 16 years old? Chances are you should probably start with the Metroid Prime trilogy and perhaps Zelda: Twilight Princess... unless you think you're man enough to withstand the amount of kiddieness that oozes from Wind Waker, which has the potential to turn you into either a 5-year-old or a girl. That game is only for real men.
Last edited by ozfunghi; 01-24-2013 at 12:11 AM.
Fourth Storm
Member
(01-24-2013, 12:33 AM)

Originally Posted by OG_Original Gamer

It wouldn't make any sense for Nintendo to do this considering the bandwidth requirements of shaders. I think devs chose to do straight ports; these were small teams that spent more time on GamePad ideas. Otherwise it would have been pointless to put in 32 MB of eDRAM and keep the bus small.

Almost as pointless as including 2 GB of RAM but then crippling it with a 64-bit bus?

Plainly, cost is the deciding factor. If integrating the eDRAM with a wider bus reduced yields or cost more in R&D, Nintendo likely passed. As it is, even the lowest configuration gives them "good enough" results and easily provides Wii BC.
