BTW, what is the "4" in your formula for?
Jaguar can operate on 128 bit SIMD (ie 4 x 32 bit), whereas Espresso's limited to 64 bit (2 x 32 bit) "paired singles" SIMD.
BTW, what is the "4" in your formula for?
"I thought eDRAM is only on the L3 and the 3MB of L2 is SRAM. 3MB of L2 is substantial, this isn't even counting the rest of the uncore."

The CPU has no L3; the L2 is eDRAM. Comparable to the PowerPC A2 (PowerEN and Blue Gene/Q).
It's the single-precision floating-point SIMD width for Jaguar (vs 2 for Wii U, with paired singles).
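Putting the quoted figures together, a back-of-envelope peak-FLOPS sketch. All numbers here are assumptions rather than confirmed specs: Jaguar at ~1.6 GHz with 4-wide SIMD, Espresso at ~1.24 GHz with 2-wide paired singles, and a multiply-add counted as two FLOPs per lane per cycle:

```python
# Back-of-envelope peak single-precision FLOPS, using the figures quoted
# in the thread (all assumptions, not confirmed specs):
#   Jaguar (Orbis/Durango): 8 cores, ~1.6 GHz, 128-bit SIMD = 4 x 32-bit lanes
#   Espresso (Wii U):       3 cores, ~1.24 GHz, 64-bit paired singles = 2 lanes
def peak_sp_gflops(cores, clock_ghz, simd_lanes, flops_per_lane=2):
    # flops_per_lane=2 counts a multiply-add as two FLOPs per cycle
    return cores * clock_ghz * simd_lanes * flops_per_lane

jaguar = peak_sp_gflops(8, 1.6, 4)     # ~102.4 GFLOPS
espresso = peak_sp_gflops(3, 1.24, 2)  # ~14.9 GFLOPS
print(jaguar / espresso)               # roughly a 7x gap
```

As noted elsewhere in the thread, counting two FLOPs per lane assumes a multiply-add every cycle on both chips, which isn't exactly how Jaguar's SSE code would look in practice.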
Where was it confirmed that the ARM does anything for the OS, and that the Wii U multitasks with games?
"PS3 didn't have programmable shaders, did it?"

Of course it does.
Actually, it's a bit harsher (for Nintendo) than that: if they think the best way to use their resources is to develop games for the Wii U, they will do it. There might be a lot of games that could be profitable on Wii U, but other projects on the twins could prove even more profitable, and thus no project would ever be greenlit by a big publisher on Wii U because better opportunities are to be sought elsewhere. It's an explanation that holds particularly true in the current case of the 3DS; it doesn't get games because S-E, Konami, EA, Ubi, Rockstar and the others think it's more profitable to put their eggs in other baskets. Which in the case of S-E is totally stupid.

There's a big difference between 360-Wii ports and 720-Wii U ports. In the former case, there's a huge gulf in functionality and a huge gulf in power. In the latter, there's a relatively small gulf in functionality and a moderately large gulf in power. The Wii simply couldn't do things the 360 could do (ie programmable shaders), so games like CoD had to be rebuilt from the ground up for Wii. By contrast, a sufficiently scalable engine should allow a single game to be cross-developed for Wii U, 720 and PS4.
Of course, the more important numbers when it comes to ports are financial numbers. Fundamentally, if a publisher believes that the revenue gained from porting a game to a platform exceeds the costs of porting said game to said platform, then they'll pay for the port. If they don't, they won't.
"Well, they had to change a lot:"

IMO it's likely that they didn't change much besides adding SMP support. At least up to now nothing hints at that.
Anyway, it's barely debatable that Jaguar's IPC is at least a bit higher. And then we have 8 cores instead of 3 and a 30% higher clock rate, which adds up to a factor of 3.5.
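The factor of 3.5 above checks out quickly (assuming the quoted ~30% clock advantage and equal IPC, which flatters neither side):

```python
# Sanity check on the "factor of 3.5": core-count ratio times the quoted
# ~30% clock advantage, with IPC assumed equal on both sides.
cores_ratio = 8 / 3               # Jaguar cores vs Espresso cores
clock_ratio = 1.3                 # the quoted 30% higher clock
print(cores_ratio * clock_ratio)  # ~3.47, i.e. "a factor of 3.5"
```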
I agree with oversitting that the CPU could be crucial when it comes to downports.
Sorry for the double post, but I had to get this off my chest.
After browsing the "games known for their visuals" thread, someone reminded me of The Conduit for Wii. For those that don't know, The Conduit was supposed to be a game promising Xbox 360 graphics on the Wii. They even made a video showcasing what their engine could do.
https://www.youtube.com/watch?v=5tovnipDToc
It obviously never matched the 360 1:1 but it was an impressive effort given the API they had to work with.
Now imagine if a developer went through the same effort for Wii U, trying to bring PS4/720 visuals to the console. It's unlikely to be 1:1, but with 3 out-of-order CPU cores, 1GB of RAM and a 2012 GPU, the difference between Wii U and PS4/720 should be far smaller this time.
"It depends on what you are looking at, but in some metrics the CPU performance difference could be >7x."

The above needs the remark that, since x86-64 SIMD doesn't actually have a MADD op, that ratio is for one CPU doing MADDs while the other does MULs & ADDs, which is not exactly equivalent coding-wise.
"I'm really curious about that. Would you care to be a bit more specific? I thought that Broadway was a shrinked-overclocked Gekko."

Well, they had to change a lot:
No PPC750 ever supported SMP.
No PPC750 was ever manufactured at 45nm.
No PPC750 ever used eDRAM.
No PPC750 supported more than 1MB L2 cache.
No PPC750 was designed to run at clock speeds exceeding 1.1GHz.
Not to mention Gekko and Broadway were not off-the-shelf cores, either. Initial 750s didn't support paired singles, and Nintendo's versions have several dozen additional instructions - they were already more of a superset. And contrary to popular opinion, not even Broadway was just a shrunk Gekko. Some logic units were completely redesigned.
What was Wii's imbalance?
There are 8 of them, and they're built on 28nm. The cores inside the Wii U are probably just as small at 45nm.
Also, if the Wii U CPU is based on the IBM 750, that architecture is 15 years old. Jaguar is almost completely new, albeit it does draw from the same design as Bobcat. Given the speed that technology advanced in those years, it's likely the Wii U is missing a lot of CPU performance.
Also, Espresso contains three cores and is 32.76mm^2. Granted, that's including cache, but IBM uses eDRAM which offers roughly three times the density compared to traditional SRAM.
I thought eDRAM is only on the L3 and the 3MB of L2 is SRAM. 3MB of L2 is substantial, this isn't even counting the rest of the uncore.
Wait now, when did we get to it having 3MB L2 cache? I thought it just had 3MB L3 cache made of eDRAM like other IBM designs. I've never heard of L2 and L3 cache being the same size.
"Not in the sense that he probably meant, RSX had fixed function pixel/vertex shaders."

That's not 'fixed function', that's 'non-unified'. Harder to utilize fully, but still fully programmable.
http://en.wikipedia.org/wiki/RSX_'Reality_Synthesizer'
How does wii u compare to orbis?
"I'm really curious about that. Would you care to be a bit more specific? I thought that Broadway was a shrinked-overclocked Gekko."

It was mentioned by IBM engineers in LinkedIn profiles a while ago. Don't know what exactly was redesigned and why. The only unit specified in one profile was the adder.
"Not sure which way you're leaning with this? A Core 2 Duo also on 45nm is over 120mm2 (just going by memory, somewhere around there), and I think Cell is still over 100mm2 on 45nm."

That's pretty much my point. Espresso cores are by no means huge, but neither are other embedded cores like ARM - or Jaguar.
"I would call those minor changes that don't imply any deeper overhaul of the actual architecture. Far from exciting from a performance standpoint."

I wouldn't be surprised to see changes at least comparable to Gekko, which was a very significant evolution of the "regular" PPC750CXe, extending the ISA by several dozen mostly graphics-related instructions and sporting a heavily modified FPU that introduced limited SIMD capabilities to the 750 line. Nintendo did it once; I see no reason for them not to do it again. If that leads to a smaller, more efficient chip, it would probably warrant the higher development costs.
Of course we don't know for sure what else might have changed. It's just my guess that it won't be much, among other things because the development costs would've been high.
That's a common misconception. A 100% efficient 70W PSU delivers 70W of power. An 80% efficient 70W PSU also delivers 70W of power, only it must draw 87.5W from the wall. For a good PSU, its efficiency affects its power bill, not its output.
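The arithmetic behind that, as a small sketch: the rating is output capacity, and efficiency only changes the wall draw.

```python
# Rated output vs wall draw: a PSU's wattage rating is what it can deliver;
# efficiency only determines how much extra it pulls from the wall as heat.
def wall_draw_w(output_w, efficiency):
    return output_w / efficiency

print(wall_draw_w(70, 1.00))  # 70.0  (a perfect PSU)
print(wall_draw_w(70, 0.80))  # 87.5  (80% efficient, still 70 W delivered)
```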
"Isn't L2 going to be much slower and harder to access with eDRAM?"

IBM uses that exact technology for two of their latest and greatest performance chips, so apparently not.
Ideaman comments that a next-gen game once thought not possible on the Wii U is now not only possible but may be in development for it.
If we are talking about the CPU just for gaming, we need to take out the 2 cores that Durango was reported to be using for the OS. Wii U has a multi-core ARM processor for that. So that would be x5.33 just for gaming with your calculations.
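The x5.33 works out if the calculation uses round numbers (an assumption on my part: Jaguar at 1.6 GHz with 4 SIMD lanes, Espresso at 1.2 GHz with 2 lanes):

```python
# Reconstructing the x5.33 figure (hedged; assumes the round numbers the
# calculation appears to use: Jaguar 1.6 GHz x 4 SIMD lanes,
# Espresso 1.2 GHz x 2 lanes).
espresso = 3 * 1.2 * 2          # 3 cores
jaguar_all = 8 * 1.6 * 4        # all 8 cores
jaguar_games = 6 * 1.6 * 4      # 2 cores reserved for the Durango OS
print(jaguar_all / espresso)    # ~7.1, the ">7x" figure
print(jaguar_games / espresso)  # ~5.33, "just for gaming"
```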
"And possibly another core for audio, assuming a developer (Likely only Nintendo) uses the DSP."

IIRC the leaks for both Durango and Orbis indicate separate audio processing units.
"And possibly another core for audio, assuming a developer (Likely only Nintendo) uses the DSP."

Every developer using middleware, or at least FMOD, for audio also uses the DSP. FMOD even supports DSP/CPU load balancing on Wii U, which is pretty awesome. Basically, if too much happens and the DSP can't keep up, it will overflow to the CPU before dropping stuff - if the CPU has cycles to spare, that is.
It appears that no one has yet managed to crack open the GPU and see just what R700 (4000 series) model it most resembles, let alone benchmark it at all. It appears an absolute NDA regarding Wii U technical specs exists after all, just not publicly.
But going from the scant info present, it appears the closest R700 GPU it resembles is the 4650/4670, which would be a clone of the 3850/3870 of a generation previously. Assuming being programmed as part of a streamlined console would bring a moderate boost, the final performance would, by the most generous metric, be close to that of an HD 4830. What does that mean? In all likelihood, we'll see close to max-quality Crysis 1 (c.2007-2008) visuals near the end of Wii U's lifespan... at 720p. Perhaps a bit higher resolution, but it will never reach max quality at 1080p, while Orbis and Durango most certainly can, right at debut.
We're not talking about power draw from the wall. We're talking about rated output, which is AFAIK what the Wii U PSU's 75W is labeled as.
But you are 100% correct.
I really hope the CPU and memory bandwidth don't cause issues and Nintendo have found some architecture to get around them both. I just can't see how this system is even going to be capable of exceeding PS3/Xbox 360 level graphics when the CPU and MEM1 bandwidth seem so gimped.
IBM uses that exact technology for two of their latest and greatest performance chips, so apparently not.
Looking at Blue Gene/Q might be interesting in that regard, as that chip does some interesting things with its L2. Multi-versioning, atomic transactions and such.
"The L2 is also eDRAM in that? I thought POWER7 only used eDRAM for the L3 to buffer main memory."

POWER7 uses SRAM for L2. PowerEN and Blue Gene/Q use eDRAM.
"Cool, did not know that."

Both those chips are quite exotic, so that isn't all that surprising. They're targeting very specific applications. PowerEN is for ultra high end networking equipment, and Blue Gene/Q is a real supercomputer. What's interesting is that Blue Gene/Q not only uses eDRAM for L2, it also does a few unusual things with its L2, as I mentioned.
Can anyone compare what we know about the Wii U with what we heard from VGLeaks about Durango and Orbis? It seems the power gap won't be insurmountable, but who am I to talk...
MEM1 is the eDRAM, not the 12.8GB/s pool (MEM2).
Durango is two WiiU's duct-taped together.
This is actually kind of true.
"Pretty sure the leaked dev kit specs listed MEM1 as the DDR3 pool, not the eDRAM."

Nope: http://www.vgleaks.com/world-exclusive-wii-u-final-specs/
"OK, the popular opinion is that between the PS2 and the Xbox there was a three times performance gap."

No.
"IBM uses it for high performance server and workstation chips that optimize for high throughput use. They bet on larger caches cancelling out the fact that eDRAM is slower than SRAM."

It's not slower than SRAM from a certain capacity up.
"IBM uses it for high performance server and workstation chips that optimize for high throughput use. They bet on larger caches cancelling out the fact that eDRAM is slower than SRAM. It works well for them in high performance compute, that's one area where they beat even the mighty Intel."

I don't know either, but it was certainly more R&D effort to go with eDRAM, and the chip would cost the same if they used 1MB SRAM, so 3MB eDRAM being superior is the most logical conclusion.
That said, I have absolutely no idea how it would scale to a tiny chip with 3MB of it at 1.2GHz. It's thrice as dense as SRAM, so the alternative would be 1MB of faster SRAM; I have no idea if three times the capacity makes up for the lower speed in a gaming context.
Porting from 720/PS4 to Wii U will be somewhere in between porting from 360 to Wii and porting to the Vita. This is assuming devs will be able to compress the textures etc. to fit in the smaller memory space.