
oversitting
Banned
(01-23-2013, 06:12 AM)

Originally Posted by Zoramon089

This is an incredibly huge leap in logic. We don't know significant portions of the HW and you're already making hyperbolic statements about the system... I'd say to wait...

It's pretty obvious the GPU is more powerful than the 360's. At minimum it should be able to do what the 360 can in terms of graphics. This will of course depend on developer support and how much money and time they take to make the games.
ikioi
Banned
(01-23-2013, 06:13 AM)
It's really hard to gauge the Wii U's performance.

We don't know the GPU's specs well enough: ALUs, ROPs, eDRAM bandwidth, etc. Let alone whether it's entirely based on the R700 series, or a hybrid of R700 and more modern AMD tech.

The CPU does seem weak, but that has to be put into context. It's not like AMD's Jaguar is powerful by any stretch of the imagination. It seems all three next-gen consoles are going with more general-purpose, lower-clocked CPUs with a focus on high instructions per clock, moving the heavy SIMD and other work that Cell's SPEs and Xenon's cores did onto the GPUs.

The Wii U's MEM2 bandwidth also seems weak: only a 64-bit bus at 12.8 gigabytes per second. But that can be offset somewhat by increased CPU cache, larger general-purpose eDRAM for the GPU, and a more modern architecture. It's possible Nintendo has engineered a console with significantly reduced dependency on its MEM2 pool versus the Xbox 360 and PS3, and a heavy focus on using cache and eDRAM to limit reads and writes to I/O. That said, the MEM2 throughput is still not good; it wouldn't have cost Nintendo much to go for a 128-bit or even 256-bit bus.
shinra-bansho
Definitely Not Y2Kev's Alt Account
(01-23-2013, 06:19 AM)

Originally Posted by ozfunghi

If that's what they were going for, isn't it strange how much it resembles the concept of Durango? Surely they could have gone for other options, making ports between PS360 and WiiU easier?

The "concept" of Durango isn't really that far off the concept of the 360 in terms of memory structure, as far as I can tell.

The Wii U was billed as being easy to port from the 360 in early leaks and PR iirc. It really isn't a stretch to imagine ~360 was the target performance level, within a very small profile and power envelope.
Last edited by shinra-bansho; 01-23-2013 at 06:26 AM.
JordanN
Junior Member
(01-23-2013, 06:21 AM)
Edit: Ok, I see context was added so I see no need to complain.
Last edited by JordanN; 01-23-2013 at 06:28 AM.
lwilliams3
Member
(01-23-2013, 06:25 AM)

Originally Posted by shinra-bansho

The "concept" of Durango isn't really that far off the concept of the 360 in terms of memory structure, as far as I can tell.

The Wii U was billed as being easy to port from the 360 in early leaks and PR iirc.


As with the GameCube, Wii, and Wii U. Even the PS2 and its eDRAM, perhaps? The biggest changes throughout the generations are the speed and size differences between the fast memory bank and the slow one. I suppose the Xbox, PS3 and PS4 are the odd ones out.
Zoramon089
Banned
(01-23-2013, 06:26 AM)

Originally Posted by oversitting

Its pretty obvious the gpu is more powerful than the 360. At minimal it should be able to do what the 360 can in terms of graphics. This will of course depend on developer support and how much money and time they take to make the games.

I know. I meant more that you seemed to be lowballing it, expecting only Halo 4 at 720p 30fps... we know the GPU is much stronger, so I think we can expect more. Of course, like I said, there are still too many unknowns.
Margalis
Banned
(01-23-2013, 06:40 AM)

Also, to clarify: when I say the eDRAM is slower than the Xbox 360's, I'm only referring to bandwidth between the eDRAM and the ROPs.

The connection between the 360 eDRAM parent and daughter die is 32GB/s; the connection between parent and main memory is 22.4GB/s - which you have to go through if you are going to do something useful with what you wrote to eDRAM.

The GPU cannot read from eDRAM, the read bandwidth there is a whopping zero.

Calling it "fast" and "high bandwidth" and the WiiU implementation you suggest slow and low-bandwidth is both an apples to oranges comparison and a purely semantic argument.

I'm pretty sure nobody has used RAM "slowness" to mean "throughput to ROPs" (you mean "from" the ROPs?) until your elaborate backtrack just now.

Before this elaborate clarification you said:

The eDRAM on the GPU seems be on a slower bus then that of the Xbox 360's.

Now you seem to be saying that you meant write-only bandwidth from the ROPs instead, which I'm pretty sure cannot really be called a "bus" at all, certainly not more so than two other relevant actual buses.
Last edited by Margalis; 01-23-2013 at 06:52 AM.
oversitting
Banned
(01-23-2013, 06:40 AM)

Originally Posted by Zoramon089

I know. I meant more that you seemed to be lowballing it, expecting only Halo 4 at 720p 30fps... we know the GPU is much stronger, so I think we can expect more. Of course, like I said, there are still too many unknowns.

I said it was doable, I did not say it was the absolute max. And I think Halo 4 looks amazing for hardware of the 360's caliber.
OrbitingCow
Banned
(01-23-2013, 06:44 AM)
Honestly, and I do love Nintendo, but 30fps isn't going to cut it for me. There is simply no going back in certain games once you play at 60fps all the time.

If they can get 720p 60fps and 2xMSAA out of Zelda, I would bite. If not, then what the hell is the point? I am NOT repeating another gen of shitty IQ unless said system is cheap as hell.

Seriously: no gamepad, pack in the Pro Controller, beef up this system's specs by 2x, and I would have been there launch day. Sigh.
ikioi
Banned
(01-23-2013, 06:48 AM)

Originally Posted by Margalis

The connection between the 360 eDRAM parent and daughter die is 32GB/s; the connection between parent and main memory is 22.4GB/s - which you have to go through if you are going to do something useful with what you wrote to eDRAM.

Correct.

eDRAM - GDDR3 - then back. That's the process required.


Calling it "fast" and "high bandwidth" and the WiiU implementation you suggest slow and low-bandwidth is both an apples to oranges comparison and a purely semantic argument.

I'm pretty sure nobody has used RAM "slowness" to mean "throughput to ROPs" until your elaborate backtrack just now.

I'll accept that criticism. I really did fail to clarify what I was saying.

So yep, I'll cop that.
SpoonyBard
Member
(01-23-2013, 06:59 AM)

Originally Posted by OrbitingCow

Seriously: no gamepad, pack in the Pro Controller, beef up this system's specs by 2x, and I would have been there launch day. Sigh.

Honestly, that sounds boring as hell. There are other systems for people like you.
Margalis
Banned
(01-23-2013, 07:03 AM)

Originally Posted by ikioi

I'll accept that criticism. I really did fail to clarify what I was saying.

So yep, I'll cop that.

Yeah, no, sorry. The problem isn't that you didn't "clarify" what you were saying; the problem is that what you said was nonsense. Then, in an attempt to justify it, you "clarified" it by completely changing the meaning of what you said, including changing the meaning of specific technical jargon, in order to save face and justify your console-warrior ranting. And the final point you ended up at has essentially nothing at all to do with the original point in your rant.

The fact that the 360 can do super-fast ROP stuff is cool. Yay. Too bad resolving to main memory is itself a huge bottleneck, in part because the same design that gives you "super high bandwidth mumble mumble" involves passing through two lower-bandwidth bottlenecks.

You say to find a WiiU developer who hasn't complained about this or that? Find a 360 developer who hasn't complained about resolving textures to main memory, tiling, etc.

You accuse people of being "fanbois" but looking at only the positives of one system and only the negatives of another sure looks like fanboyism to me.

An actual analysis shows that there are a lot of pros and cons to the 360 setup, not the "lolz WiiU slow eDRAM 360 fast eDRAM lolz" nonsense you are putting out.

Your posts are full of errors of all kinds, from repeated basic math errors to constant typos and improper capitalization. You use words one way, then in your next post use them a different way - neither being correct. You are obviously hastily typing up rants and later trying to justify them while filling your posts with hyperbole and polemics. It's tiresome to read, and you aren't convincing anyone of anything other than that you are, for some reason, very emotionally invested in this.

If you want to try your hand at honest analysis, feel free. If you want to continue producing silly nonsense wrapped in shoddy technical arguments, featuring a constant failure to correctly multiply numbers together, feel free not to bother. Please.

Learn the difference between analysis and advocacy.
Last edited by Margalis; 01-23-2013 at 07:14 AM.
blu
Member
(01-23-2013, 08:04 AM)
As much as I hate doing this...

Originally Posted by ikioi

You mean like how I talked about the Wii U's MEM2 pool being on a 64bit bus? DDR3 1600 on a 64bit bus. 4 chips, 512 megabyte capacity, each on a 16bit bus. 16bit x 4 = 64bit. 200mhz base clock x 4 x 16 = 10.2gbs per second of bandwidth. This is in comparison to the next gen consoles, which appear to be using 256bit for their main memory pools. The Xbox 360 and PS3 also used a 128bit bus for their GDDR3, which still provides more raw bandwidth than the Wii U's. Even with a modern memory controller there's no way the Wii U's RAM is on par, even in the real world, vs the Xbox 360's memory.

Or the likelihood that the GPU only has 8 ROPs due to the low memory bandwidth of the Wii U. ROPs are bandwidth dependent; there's no point adding more ROPs unless you can feed them data fast enough.

To which I expanded by using the Xbox 360 as an example. With the Xbox 360, the ROPs were integrated into the eDRAM. Due to this configuration, the ROPs could be fed data at around 256 gigabytes per second. The Wii U's eDRAM implementation does not seem to be similar to this, with its bus being considerably slower.

Or the fact that the CPU is the size of a single Intel Atom core and has an incredibly low TDP. It's also based on the decade-old IBM PPC 750 architecture. Its performance is going to be anything but stellar.



There's no smoke without fire. In the case of the Wii U's CPU, the house is well and truly alight. We've seen DICE slam it, Crytek slam it, unnamed sources months ago slam it, and even developers publicly comment on how it was an obstacle they had to work around.

There's also no denying the CPU is based on decade-plus-old IBM PPC 750 architecture, and has the transistor count of a single Intel Atom core. It also has an incredibly low TDP.




Technical knowledge from a small-time indie developer who has made one simple, small game for the e-store. They also just so happen to only make games for Nintendo hardware.

Yeah, totally indicative of the Wii U's performance and an unbiased source.

Also, you're just as full of rhetoric as anyone else here. You come into this thread criticising people for their arguments and views, yet offer none of your own.

Pal, no offense, but your posts (like the one above) have contributed nothing to this thread. On the contrary, they've brought the level of discussion down several notches.

You couldn't be bothered to correctly compute the BW of a well-known RAM & bus configuration, even though you got the basic multipliers right (hint: it's not 10.2GB/s) - how you managed to do that is beyond me.

You come up with the most absurd of ideas - that the U-GPU's ROPs would be bound to the slower DDR3 pool, and not to the eDRAM pool.

You end your posts with 'Fuck you, console vendor X'.

Have you considered the possibility you might be in the wrong thread, as per your current state of mind?
Last edited by blu; 01-23-2013 at 08:07 AM.
Kenka
Member
(01-23-2013, 08:14 AM)
If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and WiiU, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open in the past about his job, and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before purchasing the WiiU. It's tiring to wait for announcements from Nintendo's side.
nordique
Member
(01-23-2013, 08:23 AM)

Originally Posted by Margalis

The connection between the 360 eDRAM parent and daughter die is 32GB/s; the connection between parent and main memory is 22.4GB/s - which you have to go through if you are going to do something useful with what you wrote to eDRAM.

The GPU cannot read from eDRAM, the read bandwidth there is a whopping zero.

Calling it "fast" and "high bandwidth" and the WiiU implementation you suggest slow and low-bandwidth is both an apples to oranges comparison and a purely semantic argument.

I'm pretty sure nobody has used RAM "slowness" to mean "throughput to ROPs" (you mean "from" the ROPs?) until your elaborate backtrack just now.

Before this elaborate clarification you said:



Now you seem to be saying that you meant write-only bandwidth from the ROPs instead, which I'm pretty sure cannot really be called a "bus" at all, certainly not more so than two other relevant actual buses.


Just to potentially add to this, in general, regarding the "on-paper" 360 bandwidth that gets brought up against the Wii U: it's important to consider that the 360's bandwidth seems to be limited from its theoretical figure to 10.8GB/s (aggregated upstream/downstream bandwidth, or so the tech-speak results from Google searches tell me) in real-world situations. I don't know if any 360 developer here can confirm that, but I also recall reading, back during the heyday of HD console discussion, that the real-world performance of the two HD "twins" wasn't exactly what the spreadsheets said.

That is, many of the on-paper specs of the 360 and PS3 were really bloated compared to real-world performance.
nordique
Member
(01-23-2013, 08:29 AM)

Originally Posted by Kenka

If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and WiiU, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open in the past about his job, and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before purchasing the WiiU. It's tiring to wait for announcements from Nintendo's side.

It should be possible to port a game across all three, and all "next-gen engines" should be fairly scalable. Whether a game comes to Wii U depends less on the tech side and more on the publisher's dime - I mean, Dead Rising was ported to the Wii because Capcom decided so.

If you're concerned about which games are coming out before committing to a Wii U purchase, I would suggest you wait until the other two are revealed and decide for yourself.
Kenka
Member
(01-23-2013, 08:38 AM)

Originally Posted by nordique

It should be possible to port a game across all three, and all "next-gen engines" should be fairly scalable. Whether a game comes to Wii U depends less on the tech side and more on the publisher's dime - I mean, Dead Rising was ported to the Wii because Capcom decided so.

If you're concerned about which games are coming out before committing to a Wii U purchase, I would suggest you wait until the other two are revealed and decide for yourself.

Yes, this seems wiser. It's a bit of a bummer that the entire future library could be built solely on good relationships between Nintendo and third parties, since Nintendo doesn't care much anyway.
Durante
I'm taking it FROM here, so says Mr. Stewart
(01-23-2013, 08:49 AM)

Originally Posted by Kenka

If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and WiiU, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open in the past about his job, and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before purchasing the WiiU. It's tiring to wait for announcements from Nintendo's side.

You want a simple answer to a complex issue.

Of course it's possible to create a game targeting all 3 platforms. However, it's also possible to create a game targeting everything from 3DS to PS4.

Of course it's also possible to create a game that targets PS4 and 720, and is extremely hard to port down to Wii U.

It's not a yes/no question, it's very much a sliding scale. The only thing anyone can do to provide a general answer is estimate the relative difficulty of multi-platform development.

And anyway, if your question is really "will these games come to Wii U", it may actually be at least as important to look at sales threads as it is to look at this thread.
Last edited by Durante; 01-23-2013 at 08:57 AM.
ozfunghi
Member
(01-23-2013, 09:14 AM)

Originally Posted by Kenka

If the techies cannot answer the question of whether development of next-gen titles can be done on all three of Durango, Orbis and WiiU, it's wiser, I think, to ask a developer on GAF instead. Is there a dev who has been open in the past about his job, and who already has experience with all three dev kits? I assume many of us want to know the answer to this question before purchasing the WiiU. It's tiring to wait for announcements from Nintendo's side.

That answer is open to interpretation, I think. Without wanting to come across as presumptuous - because I'm not the guy with the info - I think it will almost always depend on the game, the developer and the publisher, more than the actual hardware being a clear-cut case of being able to get these ports or not.

So the hardware is weaker; obviously games that are going to be pushing the other systems, or that rely in such a way on certain strengths of that hardware, are not easily ported down. Should the Wii have gotten a Dead Rising port? Should the Xbox have gotten a Half-Life 2 port? Should PS360 have gotten Far Cry 3? Obviously, anything "can" be ported; the question is whether it should be ported under any and all conditions. There will be games that "should" not be ported to WiiU, for sure. But I think many games "could" be ported as well, meaning there should be no real obstacle to doing so other than somewhat downsized graphics. But is the developer up for it? I have a feeling that if you'd ask the guys over at Arkam's or Lherre's studio, they'd seem rather sceptical and/or dismissive, while if you'd ask the guys from Shin'en, I bet they'd say it is possible in many cases, given the proper resources. And yet, if publishers now aren't even considering WiiU for CURRENT-generation games, regardless of hardware constraints, well, how much hope should we have for the future?
tipoo
Banned
(01-23-2013, 12:56 PM)

Originally Posted by Donnie

Well, we have the full hardware diagram for XBox 3, which describes some pretty small details, yet there's no mention of a DSP. Also, sound may be described as trivial for modern PC CPUs such as the Phenom II and Intel Core i series, but Jaguar isn't in that league. It wouldn't surprise me if audio could take up as much as a whole Jaguar core, or close to it.

XBox 3's Jaguar CPU is clocked 29% faster, but it's got a much longer pipeline - 14 stages compared to Espresso's 4. I'm not saying that gives the WiiU's CPU the advantage core for core, or even makes it 100% as fast (we really can't compare exactly), but it should help to counteract the lower clock speed.

One whole Jaguar core for sound doesn't strike me as right. Jaguar isn't as powerful as Bulldozer or Ivy Bridge, but if audio takes a few percentage points of one of those cores, I can't imagine it taking 100% of a Jaguar core. It's slower, but not by that much.

And if it did, why wouldn't they include a DSP, which is so tiny and cheap to produce?

EDIT: According to this, the nextbox does have audio acceleration

http://www.eurogamer.net/articles/df...box-specs-leak
Last edited by tipoo; 01-23-2013 at 01:54 PM.
ScepticMatt
Member
(01-23-2013, 02:08 PM)
WiiU DDR3 bandwidth is 12.8 GB/s (assuming a 1600 MT/s data rate x 64-bit interface / 8 bits per byte).
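The arithmetic in that parenthetical can be sketched in a few lines (illustrative only; these are theoretical peaks, and real-world throughput is lower, as discussed elsewhere in the thread):

```python
# Peak DRAM bandwidth = data rate (MT/s) * bus width (bits) / 8 bits per byte.
# Theoretical peak figures only, not sustained real-world throughput.

def peak_bandwidth_gb_s(data_rate_mt_s: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s (1 GB = 10**9 bytes)."""
    return data_rate_mt_s * 1e6 * bus_width_bits / 8 / 1e9

print(peak_bandwidth_gb_s(1600, 64))   # Wii U MEM2 (DDR3-1600, 64-bit): 12.8
print(peak_bandwidth_gb_s(1600, 128))  # same chips on a 128-bit bus: 25.6
```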

On the topic of VLIW (Wii U) vs non-VLIW SIMD (Durango/Orbis) architecture differences:

Originally Posted by Anandtech

AMD Graphics Core Next: Out With VLIW, In With SIMD

The fundamental issue moving forward is that VLIW designs are great for graphics; they are not so great for computing. [...]

The principal issue is that VLIW is hard to schedule ahead of time and there’s no dynamic scheduling during execution, and as a result the bulk of its weaknesses follow from that. As VLIW5 was a good fit for graphics, it was rather easy to efficiently compile and schedule shaders under those circumstances. With compute this isn’t always the case; there’s simply a wider range of things going on and it’s difficult to figure out what instructions will play nicely with each other. Only a handful of tasks such as brute force hashing thrive under this architecture.

Furthermore, VLIW lives and dies by the compiler, which means not only must the compiler be good, but every compiler must be good. This is an issue when it comes to expanding language support, as even with abstraction through intermediate languages you can still run into issues, including a compiler producing intermediate code that the shader compiler can't handle well.

In short: with VLIW, scheduling (deciding which instructions are executed when) needs to be done by the compiler, while non-VLIW hardware allows dynamic scheduling at runtime, so it can be used more efficiently, especially for compute applications.
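A toy sketch of what "scheduling by the compiler" means in practice: pack instructions into fixed-width bundles so that nothing reads a result that hasn't been computed in an earlier bundle. The instruction format and the width of 5 (echoing VLIW5) are invented for illustration, not any real ISA:

```python
# Toy illustration of static (compile-time) VLIW scheduling: instructions
# are packed into fixed-width bundles, and an instruction may only be
# placed once all of its inputs were produced in an *earlier* bundle.
# Dependencies are assumed to be acyclic.

def schedule_vliw(instructions, width=5):
    """instructions: list of (dest_reg, set_of_source_regs)."""
    written = {dest for dest, _ in instructions}  # results this block produces
    available = set()                             # results from earlier bundles
    bundles, remaining = [], list(instructions)
    while remaining:
        bundle, waiting = [], []
        for dest, srcs in remaining:
            # Schedulable only if every needed input is already available.
            if len(bundle) < width and (srcs & written) <= available:
                bundle.append(dest)
            else:
                waiting.append((dest, srcs))
        available.update(bundle)
        bundles.append(bundle)
        remaining = waiting
    return bundles

# A dependent chain serializes (one op per bundle, 4 of 5 slots wasted),
# while independent ops pack into a single bundle -- which is why VLIW
# lives or dies by how much independent work the compiler can find.
chain = [("r1", {"r0"}), ("r2", {"r1"}), ("r3", {"r2"})]
indep = [("r1", {"a"}), ("r2", {"b"}), ("r3", {"c"})]
print(len(schedule_vliw(chain)), len(schedule_vliw(indep)))  # 3 1
```

Graphics shaders tend to look like `indep` (lots of independent per-pixel math), while compute workloads more often look like `chain`, which is the Anandtech quote's point.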
Last edited by ScepticMatt; 01-23-2013 at 02:11 PM.
TheGuardian
Member
(01-23-2013, 02:14 PM)
Thought I could contribute something to this thread that isn't about eDRAM or GPGPUs :P

One specific area where the WiiU can potentially (game dependent of course) gain a bit of performance compared to the 360 and PS3 is skinning/animation.

Some rough background info for those not familiar with it: animation is, in 90% of cases, done using joints/bones. These consist of transformation (i.e. rotation) information set within a hierarchical structure called a skeleton. Being hierarchical means that if you move the parent, all the children move along with it: rotate the elbow and the arm, hand, and fingers all go together.
Animators manipulate this skeleton into various animation frames that are interpolated and played back at runtime.

Skinning is the process where you take the joint data and the 3D model, and transform each vertex (a 3D point; vertices are linked together to define polygons) in the model by the joints that affect it. Vertices in the hand are affected by the hand joints, etc.

This is a relatively expensive process, more expensive the more joints influence a vertex. Many games limit the number of influences to 4.

Now, on to the platform specific part :)

A generic way to do the skinning, is to use the vertex shader. A joint palette is sent to the shader, together with the influence weights and the rest of the normal shader data. The GPU then crunches everything together to produce the final vertex position.
The problem with this method comes with the fact that you usually need to render the same model more than once. For example, for rendering into the shadow map, rendering into a GBuffer, etc. That means the GPU has to do the skinning more than once, which is a bit wasteful.

On the PS3, for example, you have a different option. You can use the SPUs to do the skinning very efficiently, store the resulting data, and pass that to the GPU. While this is great for the GPU, especially if you are vertex bound, it requires extra RAM to store the data. Depending on the game, you might not be able to spare that memory.

On the WiiU, you can use the GPU's ability to stream out data to export the skinning data in the same way as the SPUs on the PS3, and reuse it later. The advantage on the WiiU is that you have a lot more memory, so storing this data should be easy enough for most cross platform titles.
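As a hedged CPU-side sketch of the linear blend skinning described above (plain Python with hand-rolled 4x4 matrices purely for illustration; a real engine would run this in a vertex shader, on SPUs, or via stream-out as the post explains):

```python
# Linear blend skinning sketch: each vertex is transformed by up to 4 joint
# matrices and the results are blended by influence weight. Matrices are
# plain 4x4 nested lists; this is illustrative, not engine code.

def transform(matrix, v):
    """Apply a 4x4 matrix to a 3D point (implicit w = 1)."""
    x, y, z = v
    return tuple(
        matrix[r][0] * x + matrix[r][1] * y + matrix[r][2] * z + matrix[r][3]
        for r in range(3)
    )

def skin_vertex(v, joint_matrices, influences):
    """influences: list of (joint_index, weight), weights summing to 1."""
    out = [0.0, 0.0, 0.0]
    for joint, weight in influences:
        p = transform(joint_matrices[joint], v)
        for i in range(3):
            out[i] += weight * p[i]
    return tuple(out)

# Joint 0 is identity, joint 1 translates +2 on x; a 50/50 blend moves the
# vertex halfway between the two joint transforms.
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
shifted  = [[1, 0, 0, 2], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
print(skin_vertex((0, 0, 0), [identity, shifted], [(0, 0.5), (1, 0.5)]))
# (1.0, 0.0, 0.0)
```

The cost the post describes follows directly: one matrix transform per influence per vertex, per render pass, unless the skinned result is cached (SPU or stream-out path).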

Sorry for the mega post, hope this is interesting to someone.
Donnie
Member
(01-23-2013, 02:25 PM)

Originally Posted by tipoo

One whole Jaguar core for sound doesn't strike me as right. Jaguar isn't as powerful as Bulldozer or Ivy Bridge, but if audio takes a few percentage points of one of those cores, I can't imagine it taking 100% of a Jaguar core. It's slower, but not by that much.

And if it did, why wouldn't they include a DSP, which is so tiny and cheap to produce?

EDIT: According to this, the nextbox does have audio acceleration

http://www.eurogamer.net/articles/df...box-specs-leak

I don't think these cores really compare to Bulldozer or Ivy Bridge, to be honest. One Bulldozer core is about the size of 5 Jaguar cores (well, the same size as 7, but there's a process difference to take into account, so 5 is more accurate). As far as audio processing goes, it can be surprising how much it can hit a CPU. I remember it being argued that there was no way audio would use even a significant portion of a 3.2GHz Xenon core, in reference to talk of the WiiU's DSP - until an MS engineer confirmed it often takes one entire core to process audio. Now, I'm not saying audio processing would take an entire Jaguar core, just that it wouldn't surprise me if it could take close to that; these aren't powerful cores after all.

You're right, that article does mention audio codecs, though I was under the impression that was for Kinect? I suppose we'll have to wait and see to what extent it's used within the system.

If they haven't included a DSP for games, then I'd guess the reason is that they have 8 CPU cores there. They could have included a DSP for the 360, but they had a triple-core CPU, so they decided to let that do everything.
Last edited by Donnie; 01-23-2013 at 03:10 PM.
Donnie
Member
(01-23-2013, 02:32 PM)

Originally Posted by Fourth Storm

I'm gonna throw a number out there: 70.4 GB/s. This, I believe, is the bandwidth from the Wii U GPU to its eDRAM.

Earlier in the thread, UX8GD eDRAM from NEC/Renesas was brought up as a leading (almost surefire) candidate for the Wii U GPU's on-chip memory. It comes in several configurations, but one now strikes me as most likely: 4 x 8MB macros, each with a 256-bit bus. What makes this interesting is that very early on there was talk of the first Wii U dev kits containing an underclocked RV770LE. This news got our hopes up, but 640 shaders and the like are now out of the question and, indeed, never seemed to be in the cards. Why wouldn't they have just used a weaker, smaller, and cooler PC part then (I'm assuming the report was true)? Well, the bandwidth of that card to its onboard GDDR3 just happens to be 57.6 GB/s. What's the bandwidth of eDRAM on a 1024-bit bus at 450 MHz (the reported clock of Wii U dev kits up until late 2011)? 57.6 GB/s. I know it's been hypothesized before, but it seems increasingly likely those first dev kits utilized that particular Radeon to simulate the bandwidth to the Wii U's MEM1. Since they upped the speed to 550 MHz, it should now clock in at 70.4 GB/s, for better or worse.

Wasn't the WiiU's GPU originally 400MHz? Sorry to ruin the hypothesis a bit, but I do remember it being mentioned that the system originally had a 400MHz GPU and a 1GHz CPU.
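For what it's worth, the figures in the quoted post check out arithmetically, assuming one transfer per clock across a 1024-bit bus as the post implies (the 400 MHz line covers the clock Donnie recalls):

```python
# eDRAM bandwidth = bus width in bytes * clock. Assumes one transfer per
# clock on a 1024-bit (128-byte) bus, as the quoted post implies.

def edram_bandwidth_gb_s(bus_bits: int, clock_mhz: float) -> float:
    return bus_bits / 8 * clock_mhz * 1e6 / 1e9

print(edram_bandwidth_gb_s(1024, 450))  # 57.6 (reported early dev-kit clock)
print(edram_bandwidth_gb_s(1024, 550))  # 70.4 (final clock)
print(edram_bandwidth_gb_s(1024, 400))  # 51.2 (if the original clock was 400MHz)
```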
Last edited by Donnie; 01-23-2013 at 02:38 PM.
ozfunghi
Member
(01-23-2013, 02:52 PM)
Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol
tipoo
Banned
(01-23-2013, 03:07 PM)

Originally Posted by ozfunghi

Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol

You mean this?

http://www.youtube.com/watch?feature...&v=K40orKpbJBU

It does look pretty good, but still nothing that would be mind-blowing on the HD 7th-gen consoles, imo. The textures and geometry are obviously far more than what the Wii could do, but that's about all I can say.
tipoo
Banned
(01-23-2013, 03:15 PM)

Originally Posted by Donnie

I don't think these cores really compare to Bulldozer or Ivy Bridge, to be honest. One Bulldozer core is about the size of 5 Jaguar cores (well, the same size as 7, but there's a process difference to take into account, so 5 is more accurate). As far as audio processing goes, it can be surprising how much it can hit a CPU. I remember it being argued that there was no way audio would use even a significant portion of a 3.2GHz Xenon core, in reference to talk of the WiiU's DSP - until an MS engineer confirmed it often takes one entire core to process audio. Now, I'm not saying audio processing would take an entire Jaguar core, just that it wouldn't surprise me if it could take close to that; these aren't powerful cores after all.

You're right, that article does mention audio codecs, though I was under the impression that was for Kinect? I suppose we'll have to wait and see to what extent it's used within the system.

If they haven't included a DSP for games, then I'd guess the reason is that they have 8 CPU cores there. They could have included a DSP for the 360, but they had a triple-core CPU, so they decided to let that do everything.


That's true, but I suspect Jaguar is still a much better core, core for core, than Xenon, even at half the clock speed; Xenon had some pretty big inefficiencies (remember, that was during the Pentium 4 era). And from what I understood of the article, the sound accelerator in Durango would handle noise cancellation for Kinect as well as other audio encoding/decoding.

By the way, was it ever settled whether audio on the 360 was taking one physical core, or just one thread (of which each core has two)? And even with 8 cores, it seems like Microsoft wouldn't want to waste 1/8th of its power just doing audio when a piece of silicon that costs so little and draws so little power could do the same. SoCs even have them integrated on-chip; they're so trivially small now.
phosphor112
Banned
(01-23-2013, 03:17 PM)
I wanted to come and post how awesome the new games looked.



pestul
Member
(01-23-2013, 03:18 PM)

Originally Posted by ozfunghi

Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol

I don't think it's spectacular, but the open world is very promising. Also, the fact that it's probably an online multiplayer game with those visuals is quite impressive.
phosphor112
Banned
(01-23-2013, 03:23 PM)
It's not anything too fantastic, but you can tell there's a fidelity increase over the PS3 and 360. I don't know, though. Xenoblade looked freaking awesome on the Wii, and we know how weak that device is. Maybe it's partly art direction? Looks great anyway.
Van Owen
Banned
(01-23-2013, 03:23 PM)

Originally Posted by ozfunghi

Anything new to gather from Xenoblade 2's trailer? I see the same huge environments as the Wii game, but everything looks... great, I guess. Does this tell us something about bandwidth? Or about anything? Anything at all? lol

It looks like it could be done on the 360.

Not to say it looks bad, per se.
FLAguy954
Junior Member
(01-23-2013, 03:24 PM)
FLAguy954's Avatar

Originally Posted by phosphor112

I wanted to come and post how awesome the new games looked.



That shore in the second gif needs some work. Something positive I noticed about the second gif, however, is how clear and easily viewable the distant areas are. A good sign, I'd say.
lherre
Accurate
(01-23-2013, 03:25 PM)
lherre's Avatar
Sound in MS's and Sony's next consoles won't be managed by the CPUs.
Donnie
Member
(01-23-2013, 03:25 PM)

Originally Posted by tipoo

That's true, but I suspect Jaguar is still much better core for core than Xenon even at half the clock speed, Xenon had some pretty big inefficiencies (remember that was during the Pentium 4 era). And from what I understood of the article, the sound accelerator in Durango would handle noise cancellation for Kinect as well as other audio encoding/decoding.

By the way, was it ever settled whether audio on 360 was taking one physical core, or just one thread (of which each core has two)? And even with 8 cores, it seems like Microsoft wouldn't want to waste 1/8th of its power just doing audio when a piece of silicon that costs so little and draws so little power could do the same job. SoCs even integrate them on-chip, they're so trivially small now.

Jaguar could very well be faster than Xenon in many ways, core for core, even at 1.6GHz. Though I don't think it's going to blow it away or anything.

It was definitely one core, though I think he said that some much less complex games got away with one thread.

I completely agree that a DSP sounds like the right way to go. Hopefully they have gone that way, for their own sake, because IMO it's a waste not to.
Donnie
Member
(01-23-2013, 03:31 PM)

Originally Posted by lherre

Sound in MS and Sony next consoles won't be managed by the cpu's.

So they did learn a lesson. Any info on how the OS affects the CPUs in these consoles? As in the claim that 2 cores are set aside for the OS and apps in XBox 3?
ozfunghi
Member
(01-23-2013, 03:41 PM)
ozfunghi's Avatar

Originally Posted by Van Owen

It looks like it could be done on 360.

Not to say it looks bad per say.

Yeah, I'm specifically talking about the scope. I don't know if there is a 360 game that pushes the same scale with that level of detail (not saying there isn't, just that I don't know).

PS: don't want to act like a douche, but you're the third person I've seen posting it this week. It's not "per say"; it comes from Latin, and it's "per se" (in/by itself). It probably stands out more to me since I'm not a native English speaker, yet we also use the expression "per se" in our language, so reading it spelled like that jumps out at me.

Originally Posted by phosphor112

I wanted to come and post how awesome the new games looked.

Blu will have your blood.


Originally Posted by phosphor112

Why?

Posting screens/gifs inevitably leads to other people posting pictures of other games on other hardware for comparison, and a flame fest follows. Come to think of it, maybe he didn't like me mentioning the game at all either, lol.
Last edited by ozfunghi; 01-23-2013 at 04:03 PM.
phosphor112
Banned
(01-23-2013, 03:43 PM)
phosphor112's Avatar

Originally Posted by ozfunghi

Blu will have your blood.

Why?
tipoo
Banned
(01-23-2013, 04:03 PM)

Originally Posted by ozfunghi

Yeah, i'm specifically talking about the scope, i don't know if there is a 360 game that pushes the same scale with that level of detail (not saying there isn't, just that i don't know).

I can't think of one at the moment, but GoW 3 on PS3 comes to mind. If you can really fly to any point on the horizon you see there, though, that is huge.
phosphor112
Banned
(01-23-2013, 04:07 PM)
phosphor112's Avatar

Originally Posted by tipoo

I can't think of one at the moment, but GoW 3 on PS3 comes to mind. If you can really fly to any point in the horizon you see there that is huge though.

Just Cause 2? There is a lot more in terms of models and things on screen though.

https://www.youtube.com/watch?v=k_C6TjwxbGI
Fourth Storm
Member
(01-23-2013, 04:47 PM)
Fourth Storm's Avatar

Originally Posted by Donnie

Wasn't WiiU's GPU originally 400Mhz? Sorry to ruin the hypothesis a bit but I do remember it being mentioned that the system was originally 400Mhz GPU and 1GHZ CPU.

Doesn't ruin it really, as the bandwidth of the Radeon card would be used as a target. We know the early dev kits had to be clocked below target because of overheating.
OG_Original Gamer
Member
(01-23-2013, 05:09 PM)
So, I guess we can confirm the eDRAM bandwidth is XXXGB/s.
Fourth Storm
Member
(01-23-2013, 05:36 PM)
Fourth Storm's Avatar

Originally Posted by OG_Original Gamer

So, I guess we can confirm the eDRAM bandwidth is XXXGB/s.

No.
OG_Original Gamer
Member
(01-23-2013, 07:11 PM)

Originally Posted by Fourth Storm

No.

Well, if Shinen says it is, or suggests it is, then it's enough for me. It's the fact that they have dev kits that leads me to believe them.
NBtoaster
Member
(01-23-2013, 07:22 PM)
NBtoaster's Avatar

Originally Posted by phosphor112

I wanted to come and post how awesome the new games looked.



Background detail looks much better than something like Skyrim. Putting that RAM to use.
OG_Original Gamer
Member
(01-23-2013, 07:28 PM)
That's putting that eDRAM to good use.
NBtoaster
Member
(01-23-2013, 07:42 PM)
NBtoaster's Avatar

Originally Posted by OG_Original Gamer

That's putting that eDRAM to good use.

I doubt much geometric detail would be in eDRAM; that'll be for the framebuffer, shadows, maybe a few textures.

Though texture detail in the distance seems good too.
Last edited by NBtoaster; 01-23-2013 at 07:50 PM.
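[As a back-of-the-envelope check on the framebuffer point above: assuming a 32 MB eDRAM pool (a commonly speculated figure at the time, not something confirmed in this thread), a full 720p color + depth target only occupies a fraction of it, leaving room for shadow maps and a few textures:]

```python
# Rough eDRAM budget sketch. The 32 MB pool size is an ASSUMPTION
# (widely rumored for Wii U at the time), not a confirmed spec.

def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """Size of one render target (e.g. 32-bit color or depth) in bytes."""
    return width * height * bytes_per_pixel

EDRAM = 32 * 1024 * 1024                 # assumed 32 MiB pool

color = framebuffer_bytes(1280, 720)     # 32-bit color at 720p
depth = framebuffer_bytes(1280, 720)     # 32-bit depth/stencil at 720p
total = color + depth

print(f"720p color+depth: {total / 2**20:.1f} MiB "
      f"({100 * total / EDRAM:.0f}% of assumed eDRAM)")
```

Roughly 7 MiB, or about a fifth of the assumed pool, which is consistent with the idea that the eDRAM holds render targets plus some extra data rather than scene geometry.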
OG_Original Gamer
Member
(01-23-2013, 07:51 PM)

Originally Posted by NBtoaster

I doubt much geometric detail would be in eDRAM, that'll be for the framebuffer, shadows, maybe a few textures..

Though texture detail in the distance seems good too.

Don't forget shaders.
Zoramon089
Banned
(01-23-2013, 07:59 PM)
Zoramon089's Avatar

Originally Posted by Van Owen

It looks like it could be done on 360.

Not to say it looks bad per say.

Well, unless you have a 360 game in mind that looks as nice at that scale (and really few games this gen have this sort of draw distance, except probably AC, RDR, and Skyrim, and this looks better than those), I'm not sure it could.


Originally Posted by Fourth Storm

No.

Well, it's your word versus AN ACTUAL WII U DEVELOPER'S. I wonder whose word has more weight (well, really, any weight)?
Fourth Storm
Member
(01-23-2013, 08:00 PM)
Fourth Storm's Avatar

Originally Posted by OG_Original Gamer

Well, if Shinen says it is, or suggests it is, then it's enough for me. It's the fact that they have dev kits that leads me to believe them.

Where did they suggest it is?
guek
Member
(01-23-2013, 08:09 PM)
guek's Avatar
Here's the thing: even if the Wii U is a step above the 360/PS3, it's not far enough ahead to look like anything close to a "next gen leap". So people like Van Owen will always be able to say "eh, doable on last gen" with nothing but casual observation as their evidence.

Expect to hear it a lot in the coming years, regardless of how Wii U games look. You could have something like 1313 locked at 30fps/720p/no AA and people would say "meh, doable on 360" because hey, it wouldn't be far from the truth.
