
WiiU technical discussion (serious discussions welcome)

deviljho

Member
Source? Anyone know if this is legit? Controller looks DS3-ish but elongated and not so thin in the middle part, but doesn't look like it has enough room for a screen or touchpad. Is that the dev kit controller or something the developer just uses for testing?

That just looks like the Wii Pro controller (not Wii U). And if wsippel is posting it, I doubt it's fake.

Yup. That's a Classic Controller Pro.
 

ahm998

Member
No offense... but I'm a bit baffled by your input.

First, you come into this thread and others (the "Xenoblade 2" thread) and ask what the Wii U is capable of. Then, on the same day, you act like an expert on all things Wii U related, seem to know better than others what people should expect, and even have some ingenious upgrade (excuse the sarcasm) for the RAM in mind over USB.

Please, I don't mean to offend, but people in this thread are trying to maintain a higher level of quality in the discussion. There are actual game developers in this topic, people who are engineers, etc.

Sorry for baffling you.

But I'm just giving my expectations based on the specs that have spread around.

Until now there is no clear info on the Wii U GPGPU, only the RAM & CPU.

As for the other consoles, VGLeaks has the Durango spec, and that one is clear.
 

ozfunghi

Member
I think the GPUs of past consoles show these polygon numbers:

PS2: 15-20 million.

GCN: 20+ million with a good CPU.

Xbox: 115 million.

Wii: double GCN, 40+ million...!

If we get the GPU & CPU information for the 720, PS4 and Wii U, we can compare.

Please, I implore you once more, stop these posts.
You are mixing theoretical raw numbers with real-life in-game numbers. The Xbox did nowhere near that number of polygons. I even doubt there was an Xbox game that pushed more polygons than Rebel Strike on GameCube (+/- 19 million, I believe).

Just keep this discussion on topic, and only state "facts" when you know what you are posting. Or ask questions.


Sorry for baffling you.
But I'm just giving my expectations based on the specs that have spread around.
Until now there is no clear info on the Wii U GPGPU, only the RAM & CPU.
As for the other consoles, VGLeaks has the Durango spec, and that one is clear.

Please, just calm down. It just doesn't make sense that you're asking about the hardware in one post, and in the next you're telling other people what to expect or floating wild ideas about what Nintendo should do.
 

ahm998

Member
Okay, Batman on Wii U has many frame drop issues and isn't 60 fps. Is that because the developer is lazy, or because the Wii U can't handle it?

Could you please explain it to me technically?
 
Because when I started looking into Nintendo's history, there seems to be only one way to match up with the PS4 & 720 after the second year at a low price.

[image: mock-up of a 4 GB USB RAM add-on for the Wii U]


This is what I am thinking about: a USB attachment for the Wii U with 4 GB of RAM, which will be cheap after 2 years.

Nintendo thinks a lot about how to make their consoles cheap.

Anybody believe in my idea? :p

Could this work?

A 4/8GB RAM box that you attach to your Wii U for increased performance?
I guess the issue is that the Wii U only has USB 2.0 ports, so the max theoretical transfer speed would be 35 MB/s.

Would that be useful?
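
For what it's worth, here's a rough back-of-envelope sketch of where a ~35 MB/s figure comes from. The 480 Mbit/s signalling rate is just the USB 2.0 spec; the ~60% effective bulk-transfer efficiency is an assumed ballpark, not a Wii U measurement.

Code:
# Rough USB 2.0 throughput estimate (assumed figures, not Wii U measurements)
signalling_rate_mbit = 480                 # USB 2.0 high-speed signalling rate
raw_mb_per_s = signalling_rate_mbit / 8    # 60 MB/s before protocol overhead
bulk_efficiency = 0.6                      # assumed real-world bulk-transfer efficiency
effective_mb_per_s = raw_mb_per_s * bulk_efficiency
print(f"~{effective_mb_per_s:.0f} MB/s effective")  # ~36 MB/s, in line with the 35 MB/s quoted above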
 

Thraktor

Member
Could this work?

A 4/8GB RAM box that you attach to your Wii U for increased performance?
I guess the issue is that the Wii U only has USB 2.0 ports, so the max theoretical transfer speed would be 35 MB/s.

Would that be useful?

No.

The N64 was designed with RAM expansion in mind, because the 64DD's vastly increased seek times meant they couldn't stream data in the same way as from cartridges. So Nintendo basically added the equivalent of the RAM slots in your PC, with all the wiring and the extra bus to go with it, in order to accommodate that upgrade. They haven't done this with the Wii U, because there's no need for an upgrade path, and if they put a wider bus on the motherboard, they might as well just make use of it immediately (as DDR3 is cheap, whereas the N64's Rambus RDRAM was expensive).

When it comes to USB, you're literally looking at access times in the region of 10,000x higher than you need for RAM. There's not much point to RAM if, by the time you've got the requested data, you're already on the next frame!
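
To put some rough numbers on that (these latencies are generic ballpark assumptions on my part, not measured Wii U figures):

Code:
# Rough latency comparison: main RAM vs. a round trip over USB
# (ballpark assumptions, not measured Wii U figures)
ram_access_ns = 100            # DDR3 random access is on the order of ~100 ns
usb_round_trip_ns = 1_000_000  # a USB 2.0 request/response is on the order of ~1 ms
print(f"USB responds roughly {usb_round_trip_ns / ram_access_ns:,.0f}x slower than RAM")  # ~10,000x

# At 60 fps a frame lasts ~16.7 ms, so even a handful of ~1 ms round trips
# per frame would eat a noticeable chunk of the frame budget.
frame_ms = 1000 / 60
print(f"One frame at 60 fps: {frame_ms:.1f} ms")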
 

ozfunghi

Member
Would that be useful?

no

Okay, Batman on Wii U has many frame drop issues and isn't 60 fps. Is that because the developer is lazy, or because the Wii U can't handle it?

Could you please explain it to me technically?

WiiU could not handle the port the way the developer wanted to port it.
Had the developer made the game from the ground up, or ported it according to the strengths of the hardware, the game would not have frame issues.

So you decide whether it is because the hardware is weak, or the dev is lazy.
 

Donnie

Member
Could this work?

A 4/8GB RAM box that you attach to your Wii U for increased performance?
I guess the issue is that the Wii U only has USB 2.0 ports, so the max theoretical transfer speed would be 35 MB/s.

Would that be useful?

USB just couldn't support any kind of worthwhile RAM, no matter whether it's USB 2 or USB 3.

As I said earlier, you might as well just use some of the built-in storage, as that would most likely be much faster (especially in latency, which is key for RAM). But it still wouldn't be effective at simulating extra RAM.
 

tkscz

Member
Okay, Batman on Wii U has many frame drop issues and isn't 60 fps. Is that because the developer is lazy, or because the Wii U can't handle it?

Could you please explain it to me technically?

Wish this phrase would die. The devs aren't lazy, they're inexperienced and low-budgeted. Batman, along with other straight ports, was made with the 360/PS3's high CPU/RAM clock speeds in mind. The Wii U has lower clock speeds, but if you read this thread, the Wii U has ways around that. Those ways wouldn't be used in a straight low-budget port, and the fact that it CAN run these ports at all should tell you it can handle it. Additionally, the devs probably wouldn't know how to use the Wii U's hardware in only about a year's time. Remember, most 360 games looked awful until around 2008 (it was released in 2005).

Also, the PS3/360 versions of Batman aren't 60 fps either.
 

tipoo

Banned
Could this work?

A 4/8GB RAM box that you attach to your Wii U for increased performance?
I guess the issue is that the Wii U only has USB 2.0 ports, so the max theoretical transfer speed would be 35 MB/s.

Would that be useful?



That's slower than a native standard hard drive. No, that would not be useful, except as a large cache I guess. The point of RAM is to provide the system access to the most needed data much faster than the hard drive/flash storage can. If it's slower than the main storage, it is pointless. You could just use secondary storage like an external hard drive if you were doing that and be just as fast/slow. Which then wouldn't really be RAM, it would be more like a swap file, and consoles don't do that due to inconsistent performance.

Basically, no.
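
Some rough numbers to illustrate the gap (all ballpark assumptions on my part, not official Wii U figures):

Code:
# Approximate sequential bandwidth of each tier (ballpark assumptions)
bandwidth_mb_s = {
    "USB 2.0 'RAM box'": 35,          # effective USB 2.0 bulk throughput
    "Typical SATA hard drive": 100,   # ballpark sequential read for a desktop HDD
    "64-bit DDR3-1600 DRAM": 12_800,  # theoretical peak, as a point of reference
}
for tier, mb_s in bandwidth_mb_s.items():
    print(f"{tier:>24}: ~{mb_s:,} MB/s")
# Anything hanging off USB 2.0 sits below the storage it's meant to supplement,
# which is why it could only ever act as a slow cache or swap, never as RAM.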
 

Donnie

Member
no



WiiU could not handle the port the way the developer wanted to port it.
Had the developer made the game from the ground up, or ported it according to the strengths of the hardware, the game would not have frame issues.

So you decide whether it is because the hardware is weak, or the dev is lazy.

Does Arkham City have v-sync enabled on all of its versions, or do any of them have screen tearing?
 

Shaanyboi

Banned
Source? Anyone know if this is legit? Controller looks DS3-ish but elongated and not so thin in the middle part, but doesn't look like it has enough room for a screen or touchpad. Is that the dev kit controller or something the developer just uses for testing?


That just looks like the Wii Pro controller (not Wii U). And if wsippel is posting it, I doubt it's fake.


It's certainly a possibility that it's the real deal. There are quite a few developers that have been quoted saying, "Yeah, when we first got the dev kits, we were just using a Classic Controller Pro for a while."

I don't know if that's still the case (I'd hope not), but it was for a time.
 
The Xbox did nowhere near that number of polygons. I even doubt there was an Xbox game that pushed more polygons than Rebel Strike on GameCube (+/- 19 million, I believe).

Back in the day, there was an Alias (the software company) email listserv for Maya (a 3D art and animation program) that I was on. A dude from Factor 5 posted on it now and then too, and I remember asking him about the two Rogue Squadron games. If I remember right, the first game was doing 14-16 million, and he said with the second one they were getting close to doubling that in some areas. So I would imagine Rebel Strike was in the mid to high 20s.
 

Thraktor

Member
That's slower than a native standard hard drive. No, that would not be useful, except as a large cache I guess. The point of RAM is to provide the system access to the most needed data much faster than the hard drive/flash storage can. If it's slower than the main storage, it is pointless. You could just use secondary storage like an external hard drive if you were doing that and be just as fast/slow. Which then wouldn't really be RAM, it would be more like a swap file, and consoles don't do that due to inconsistent performance.

Basically, no.

Even as a cache, USB drives would be fairly useless. Something along the lines of PCM (phase-change memory) would work for that, as it's sort of half-way between RAM and flash, but it'd have to be built onto the motherboard to be of any use.
 

tipoo

Banned
Even as a cache, USB drives would be fairly useless. Something along the lines of PCM (phase-change memory) would work for that, as it's sort of half-way between RAM and flash, but it'd have to be built onto the motherboard to be of any use.



Exactly, both that and a hard drive would be just as slow over USB 2.0. So would actual RAM; there's no point in anything faster if the interface limits it. A cache could be moderately useful, as that's still faster than the Wii U's optical drive and probably has lower seek latency, but it's still wasteful and not a big benefit.
 

Donnie

Member
Exactly, both that and a hard drive would be just as slow over USB 2.0. So would actual RAM; there's no point in anything faster if the interface limits it. A cache could be moderately useful, as that's still faster than the Wii U's optical drive and probably has lower seek latency, but it's still wasteful and not a big benefit.

Yeah and the internal storage would be better suited for that purpose anyway.
 

Thraktor

Member
Back in the day, there was an Alias (the software company) email listserv for Maya (a 3D art and animation program) that I was on. A dude from Factor 5 posted on it now and then too, and I remember asking him about the two Rogue Squadron games. If I remember right, the first game was doing 14-16 million, and he said with the second one they were getting close to doubling that in some areas. So I would imagine Rebel Strike was in the mid to high 20s.

Interestingly (and somewhat on-topic), this is actually the reason we're here now. Nintendo did a full specs release for the GameCube, where one of the main specs was a "real world" value of 12M polys/sec, fully textured, with lighting, etc. Sony and MS had given entirely theoretical numbers of 77M and 125M polys/sec respectively for their consoles, but those were untextured, unlit polygons that you couldn't even draw to screen.

Now, even though the GameCube launched with a game that did a reported 15 million polys/s (and very nice polys, I might add) compared to apparently about 2-3 million polys/s for the top PS2 games at the time, the majority of the gaming press was absolutely convinced that the PS2 was far more powerful. Ditto with the Xbox (which was, admittedly, a more capable machine, but not for poly count reasons). This lasted the whole way through the gen, and there are still people who think the PS2 was the more powerful console.

So basically, Nintendo went the high-tech route, managed to create a console that could produce some really nice graphics, did a full reveal of the specifications, and got none of the benefit from it, with the console being portrayed during its lifespan as a weak, kiddy console that couldn't compete with the PS2.

So, that's where we are now. Nintendo learnt that the tech race is something they aren't going to win, and that they're better off with more modest, low-power consoles whose specs they never release.
 

AlStrong

Member
Thanks. I've been looking at die shots like those for the past year or two, so I'd be fairly confident in working out a standard R700-era die. The problem is that this isn't a run-of-the-mill GPU, as there's eDRAM on there, along with a DSP, a couple of ARM cores, etc, and for all we know AMD have made significant changes to the graphics core itself. Also, as far as the "ROPs next to memory controller" rule, there are two banks of memory in play here, the eDRAM and DDR3, and one challenge will be figuring out how the GPU interfaces with each. Now, of course this is all the stuff that makes the analysis interesting, but it's also the stuff that makes the analysis tricky, at least from my perspective.

You've no interest in doing the honors for us, do you?

True, though I think there should be enough recognizable "Radeon" bits in there to get an idea of the GPU proper. Something like the DSP and ARM cores etc. will essentially just be bolted on (the ARM processor may be pretty obvious on its own), or at least distinct enough from the meat of the GPU (the shader arrays). It's a similar idea with the other fixed-function stuff like UVD & display controllers etc. on PC GPUs.

I'd expect the eDRAM to be very obvious. :)

Maybe Chipworks' report/analysis will help things along?

Anyhoo, my schedule isn't going to be good for the next while, so I can't do it, but thanks. :) Just thought I could chime in to show it shouldn't be a daunting task.
 

joesiv

Member
From the information above, I can't believe how much the X project impressed me given the weak CPU & RAM.

I don't believe so; possibly it's the 32MB eDRAM.
As for the old CPU, it's somewhat comparable to Bobcat's age. Both of these chips are modern and modified, but Bobcat IIRC was derived from AMD's decade-old K8, and while IBM's PPC 750 series is older, it was designed for minimal power and size, while the K8 was a desktop CPU that wasn't designed with those restrictions. Jaguar is an evolved Bobcat core if I've been following the leaks correctly, and only sees a 10-15% increase in performance over Bobcat. The Wii U's CPU is likely to outperform it per clock, but with double the cores Durango will easily win out in performance (6 cores for games).

I would add that Nintendo spent millions, maybe hundreds of millions, of dollars licensing the IBM tech and working with IBM engineers (along with their own engineers) to modify the PowerPC CPU for the GameCube. They modified it so that it would be exactly what they wanted in a gaming CPU, and in its day it WAS a very good CPU for gaming. On the other hand, in those days Microsoft took an off-the-shelf Pentium CPU and crippled the cache, AFAIK.

For the Wii, they took the GameCube CPU, tweaked it further, and upped the clock, probably again spending millions, though at that point I believe they still owned the license to the tech, so they wouldn't have had to spend as much.

For the Wii U, it seems Nintendo may not have wanted to throw away the hundreds of millions of dollars invested in this custom CPU that they purpose-built for gaming, and in their compilers. They know everything there is to know about the CPU, and know very well what performance to expect from it in the Wii U.

On the other hand, it seems that Sony/Microsoft are going the AMD route for the CPU, and while Nintendo *could* do that too, especially since they are already working with AMD for the GPU (originally ArtX, who did the GameCube GPU and were later bought by ATI, which then merged with AMD), they would essentially be throwing away all that invested money. They would also likely have to learn a new architecture, and probably want to tweak it again. And the terms of licensing may not work well for Nintendo's profit model. AFAIK, once you license the IBM technology for a CPU, you can do what you want with it: tweak it, and even fab it anywhere you want; it's yours. I know for sure Intel does not run under that model: you need to purchase the chips from them (you can't fab your own), which probably means you can't make many modifications either. AMD and Sony seem to be doing quite the collaboration, but AMD is probably more similar to Intel, in that they'd want you to fab with them at the prices they dictate, which is probably OK short term but more expensive long term.

Just speculation, but outside of performance, perhaps that's another take on why Nintendo is continuing with a "weak CPU".
 

Durante

Member
Can anyone explain what information you get from the pictures you want to buy? ;)
Primarily three numbers: the number of SIMD clusters, RBEs and TMUs. It should almost certainly be possible to decipher those; anything else would just be a bonus.

From these and the previously revealed clock rate, it would be possible to provide theoretical computation, fill rate and texture sampling numbers.
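
As a quick illustration of the kind of arithmetic that enables, here's a sketch. The unit counts below are pure placeholders, not a claim about the actual die; only the ~550 MHz clock figure that has been floating around is taken as given, and the 80-ALU-per-SIMD and 4-ROP-per-RBE ratios assume a standard R700-style layout.

Code:
# Back-of-envelope theoretical numbers from unit counts + clock.
# All counts below are PLACEHOLDERS for illustration, not a claim about the Wii U die.
clock_mhz     = 550   # commonly cited Wii U GPU clock
simd_clusters = 4     # placeholder
alus_per_simd = 80    # standard for R700-era SIMDs (16 units x 5-wide VLIW)
tmus          = 16    # placeholder
rbes          = 2     # placeholder (each R700 RBE has 4 ROPs, so this would mean 8 ROPs)

gflops    = simd_clusters * alus_per_simd * 2 * clock_mhz / 1000  # 2 FLOPs/ALU/clock (MADD)
gtexels_s = tmus * clock_mhz / 1000                               # bilinear texture samples
gpixels_s = rbes * 4 * clock_mhz / 1000                           # pixel fill rate

print(f"{gflops:.0f} GFLOPS, {gtexels_s:.1f} Gtexels/s, {gpixels_s:.1f} Gpixels/s")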
 

OryoN

Member
Just speculation, but outside of performance, perhaps that's another take on why Nintendo is continuing with a "weak CPU".

Seems reasonable. Additionally, when you've been using and modifying the same core CPU for 3 generations in a row, I would imagine there's an incredible reward that comes with that in terms of getting closer and closer to the "perfect" gaming CPU. This recent modification seems to be the most extreme by far, and I'm sure Nintendo's devs/engineers finally got their CPU "wish list" granted (though only as much as the budget allows).
 

Meelow

Banned
Curious, from what we get from those rumored Xbox 720 specs, does the 720 beat everything the Wii U can do or does the Wii U have something over the 720?

Sorry for the lame question.
 
Seems reasonable. Additionally, when you've been using and modifying the same core CPU for 3 generations in a row, I would imagine there's an incredible reward that comes with that in terms of getting closer and closer to the "perfect" gaming CPU. This recent modification seems to be the most extreme by far, and I'm sure Nintendo's devs/engineers finally got their CPU "wish list" granted (though only as much as the budget allows).
You know, they probably just wanted to keep the work environment and backward compatibility locked down; hence the CPU being code-compatible with Gekko and not a new architecture in any way: no SMT, no new SIMD, no real bells and whistles other than good performance per clock on a hard-to-scale-up architecture (hence being kept around the 1.2 GHz mark).

The CPU side being kept the same means they can recycle their engine tech. For instance, I doubt they're messing with Wind Waker just because; rather, they're messing with it because the new game is at an experimental stage and thus there are no real art assets for it yet (probably too soon for that to be locked down too), so they're upgrading their engine tech for the Wii U using Wind Waker.

Case in point: the engine is the same for Wind Waker, Twilight Princess and Skyward Sword, its evolution being mostly straightforward, with the graphics overhead being at least partially rewritten for each game.

So, what really changes here is that they have three higher-clocked cores now (the engine is most certainly not multithreaded for that), a different sound DSP in place, and a totally different GPU.

The GPU being totally different changes some things, like freeing up more resources on the CPU side even if it were clocked at the same speed. Namely, there's no need to track vertex coordinates (vertex shaders being part of this GPU architecture) or bone rigging on the CPU side anymore.

They have to adapt to that and bring their tech up to scratch, but that's certainly easier to do than if the CPU architecture had changed as well, or the GPU were any more different. This way it's still an evolution of sorts that lets them keep part of it the way it is (and by doing just that, reduce development time and tech R&D).

They certainly didn't modify it all that much for the Wii either; they simply die-shrunk it and upped the clock (probably changing from a PPC 750CXe to a 750CL core and reimplementing the original changes/optimizations done for the GC). This is the first time they've gone past the PPC 750GX specification of 1.1 GHz, certainly because this CPU is now deprecated and thus no longer being evolved; but on a 45nm process they could probably go even further on the clock rate without any further changes.
 

Datschge

Member
AMD and Sony seem to be doing quite the collaboration, but AMD is probably more similar to Intel, in that they'd want you to fab with them at the prices they dictate, which is probably OK short term but more expensive long term.

AMD no longer owns fabs, and I'm not sure they even want to control where chips are made themselves, considering the issues GlobalFoundries and others continue to have. I think it's purely a licensing term, just that AMD can't be excluded from future ownership of the design if it includes the x86 ISA, as AFAIK that's a non-transferable cross-license with Intel.
 
Curious, from what we get from those rumored Xbox 720 specs, does the 720 beat everything the Wii U can do or does the Wii U have something over the 720?

Sorry for the lame question.
Well, we don't know if the Xbox 720 has a dedicated sound CPU (or at least I haven't heard of it), so there's that.


Other than that, and stating the obvious ("you don't have a controller with a screen on the Xbox 720"), then no, not really. The GPU is supposed to be very custom though, and that might mean there's some stuff hardwired in there that can be done almost for "free", like on the 3DS's PICA200 Maestro-2G; if some of those effects were done on shader units it simply wouldn't be powerful enough for them, hence they're fixed-function effects executed on a fast track. That won't really put them in an advantageous position against said consoles, but it might help level things out, seeing as the others are wasting more muscle recreating it the programmable way.

It has been said that the GC also benefited from the same thing, as "cutting edge" stuff like volumetric fog was hardwired in there (as was the never-enabled "free" stereoscopic 3D support).


Speaking of other "theoretical" advantages that could make it more efficient: the CPU/GPU bridge on the GC/Wii was heavily optimized for its time, as it could interpret and deal with compressed data point to point (such as vertex data) without having to uncompress and recompress it in software in order to send it (and compressing it reduces the bandwidth needs; which incidentally is why the tessellation unit on the X360 is useless, as it will only bloat vertex coordinate data by increasing detail that way and isn't able to deal with compressed data; later revisions of the tessellation unit in the Rxxx line changed the preferred data format to a compressed one).

This console seems kinda bandwidth-starved when it comes to the main RAM, and keeping latency down on the link between the CPU and the GPU seems to be a priority, so they probably invested quite a bit of time and effort there, reducing any bottlenecks that might arise. The memory controller is probably also very efficient.
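
To give a feel for why compressed vertex formats matter for bus bandwidth, here's a generic sketch; the byte counts and vertex rate are placeholders of my own, not actual GC/Wii U figures.

Code:
# Rough illustration of how a compressed vertex format cuts bus traffic
# (generic example; sizes and vertex rate are assumptions, not real console formats)
attribs_per_vertex = 3 + 3 + 2                # position, normal, UV components
float32_bytes = attribs_per_vertex * 4        # 32-bit floats: 32 bytes/vertex
packed_bytes  = attribs_per_vertex * 2        # 16-bit fixed point: 16 bytes/vertex

verts_per_frame = 1_000_000                   # placeholder vertex count
fps = 60
for name, b in (("float32", float32_bytes), ("16-bit packed", packed_bytes)):
    mb_s = verts_per_frame * fps * b / 1_000_000
    print(f"{name:>14}: {b} bytes/vertex -> ~{mb_s:,.0f} MB/s of vertex traffic")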
 

wsippel

Banned
Source? Anyone know if this is legit? Controller looks DS3-ish but elongated and not so thin in the middle part, but doesn't look like it has enough room for a screen or touchpad. Is that the dev kit controller or something the developer just uses for testing?
It's legit. A developer released promotional material and forgot to move the devkit out of view. No idea which revision it is. Probably v4. The earliest kits (V10001) were black and I think a little smaller. As already stated, the controller is a Wii Classic Controller Pro.
 

Meelow

Banned
Well, we don't know if the Xbox 720 has a dedicated sound CPU (or at least I haven't heard of it), so there's that.


Other than that, and stating the obvious ("you don't have a controller with a screen on the Xbox 720"), then no, not really. The GPU is supposed to be very custom though, and that might mean there's some stuff hardwired in there that can be done almost for "free", like on the 3DS's PICA200 Maestro-2G; if some of those effects were done on shader units it simply wouldn't be powerful enough for them, hence they're fixed-function effects executed on a fast track. That won't really put them in an advantageous position against said consoles, but it might help level things out, seeing as the others are wasting more muscle recreating it the programmable way.

It has been said that the GC also benefited from the same thing, as "cutting edge" stuff like volumetric fog was hardwired in there (as was the never-enabled "free" stereoscopic 3D support).

Interesting, so if we take the rumors as true, does the gap between the Wii U/PS4/X720 look more like 6th gen than 7th gen?
 

AzaK

Member
So has there been any progress on this in private? We MUST have someone, right? I think Fourth Storm offered to do it. FS, do you think you can read the pics, or will you need to team up with someone?
 

Schnozberry

Member
I believe both Durango and Orbis have audio DSPs, if the leak threads are accurate. Otherwise, they'd eat up a couple of CPU threads just processing audio.

I'm very interested to see what was customized on the U GPU. I'm not hoping for miracles or anything, but a pleasant surprise or two would be fun.
 
I mean a difference like Dreamcast to PS2 to GameCube to Xbox, not a difference like Wii to PS3/360.

You're looking at it too broadly. There have been several shifts towards scalable engines and more unified architectures since the PS2 era. The Wii U will be technically capable of anything the PS4 can do; even if it's 1/4th to 1/15th the speed, it's technically capable.

However, the PS4 is in a whole other league compared to the Wii U, if the leaked specs are true, and it probably wouldn't be fiscally sound to attempt a game that's even close to a playable, faithful representation of the PS4 version on Wii U.
 

Schnozberry

Member
I mean a difference like Dreamcast to PS2 to GameCube to Xbox, not a difference like Wii to PS3/360.

"Graphics uber alles" has never decided commercial success. Trying to contrive comparisons to prior console generations is also largely a fools errand, as the gaming community and the world economy were much different back then.

I think I might be the only one on GAF who thinks Durango and Orbis are going to be huge albatrosses for the industry if Sony and Microsoft can't keep the price down.

In any case, let me know where to send my money for the purchase of that photo.
 
So has there been any progress on this in private? We MUST have someone, right? I think Fourth Storm offered to do it. FS, do you think you can read the pics, or will you need to team up with someone?

It should be easy to get the SPUs and TMUs. I would have to look at some ARM die shots in order to get a better idea of them. The ROPs I have been having a hard time discerning. Honestly, after the Orbis leak, 8 ROPs seems more and more like a lock.

But we still are not positive which chip is the GPU. I am still awaiting a reply from Chipworks to see if they can volunteer any info, like resolution or file size, so we can make a better assessment before we plunk down the cash.
 

Donnie

Member
I mean a difference like Dreamcast to PS2 to GameCube to Xbox, not a difference like Wii to PS3/360.

It's really hard to use past consoles to describe the differences between current consoles, and generation numbers are too vague in their definition.

I will say, however, that it's not as big a difference as that between the Wii and PS3/360 (though I think the difference between Sony's and MS's new consoles is going to be bigger than last time out, by the looks of it).
 
Oh, I get what he's trying to say. In my random, BS, made-up estimation, the difference between Wii U and Orbis is wider than Dreamcast to Xbox, but the Wii was functionally handicapped in addition to having gimped horsepower vs. the 360. Not a fair comparison.
 